Apr 22 19:20:39.989719 ip-10-0-133-159 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:20:39.989731 ip-10-0-133-159 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:20:39.989738 ip-10-0-133-159 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:20:39.989963 ip-10-0-133-159 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:20:50.214432 ip-10-0-133-159 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:20:50.214446 ip-10-0-133-159 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 857eac73ceae4efd9c7cd4e54c2348dd --
Apr 22 19:23:17.069072 ip-10-0-133-159 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:17.542595 ip-10-0-133-159 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:17.542595 ip-10-0-133-159 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:17.542595 ip-10-0-133-159 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:17.542595 ip-10-0-133-159 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:17.542595 ip-10-0-133-159 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:17.544564 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.544464 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:17.547890 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547874 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:17.547890 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547889 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547893 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547896 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547899 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547902 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547905 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547908 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547911 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547914 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547921 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547924 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547927 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547930 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547932 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547935 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547937 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547940 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547943 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:17.547961 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547967 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547972 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547976 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547979 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547983 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547986 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547989 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547992 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547995 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.547997 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548000 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548003 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548005 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548008 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548011 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548014 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548016 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548019 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548021 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548024 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:17.548400 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548027 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548029 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548033 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548035 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548038 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548040 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548042 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548045 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548047 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548050 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548052 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548055 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548057 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548060 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548063 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548066 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548068 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548071 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548073 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548076 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:17.548905 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548078 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548081 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548083 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548086 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548089 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548091 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548094 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548099 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548103 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548114 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548118 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548121 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548124 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548127 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548130 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548133 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548136 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548139 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548142 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:17.549434 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548146 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548149 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548151 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548154 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548156 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548160 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548163 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548166 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548594 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548599 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548602 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548605 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548607 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548610 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548613 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548615 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548618 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548621 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548623 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548626 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:17.549924 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548629 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548632 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548634 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548638 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548640 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548643 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548645 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548648 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548650 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548653 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548655 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548660 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548663 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548667 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548670 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548673 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548676 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548679 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548682 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548685 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:17.550414 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548689 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548691 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548694 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548697 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548700 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548704 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548707 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548709 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548712 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548714 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548717 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548720 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548722 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548740 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548743 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548746 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548748 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548751 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548753 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548756 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:17.550932 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548758 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548761 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548763 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548766 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548770 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548772 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548775 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548777 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548780 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548783 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548786 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548790 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548792 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548796 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548798 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548801 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548803 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548807 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548810 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548813 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:17.551431 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548815 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548818 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548820 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548823 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548825 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548828 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548831 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548833 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548835 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548838 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548840 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548843 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548845 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.548848 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551159 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551171 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551182 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551187 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551192 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551196 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551201 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:17.551962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551206 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551209 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551212 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551216 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551219 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551223 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551226 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551229 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551232 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551235 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551238 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551241 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551247 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551250 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551254 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551256 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551260 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551265 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551275 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551278 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551282 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551285 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551288 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551291 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
--cpu-cfs-quota-period="100ms" Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551294 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 19:23:17.552490 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551299 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551303 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551307 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551310 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551313 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551316 2579 flags.go:64] FLAG: --enable-server="true" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551318 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551326 2579 flags.go:64] FLAG: --event-burst="100" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551329 2579 flags.go:64] FLAG: --event-qps="50" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551333 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551336 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551339 2579 flags.go:64] FLAG: --eviction-hard="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551343 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551346 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551349 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551352 2579 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551355 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551358 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551361 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551364 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551367 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551370 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551373 2579 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551377 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551381 2579 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 22 19:23:17.553130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551384 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551393 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551396 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551399 2579 flags.go:64] FLAG: --help="false" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551403 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551406 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551409 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551413 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551416 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551420 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551423 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551426 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551429 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551432 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551435 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551438 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551441 2579 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551445 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551447 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551451 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551454 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551457 2579 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551459 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551462 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:17.553774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551465 2579 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551470 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551473 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551477 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551480 2579 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551483 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551486 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551489 2579 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551493 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551497 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551501 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551505 2579 flags.go:64] FLAG: --max-pods="110" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551509 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551512 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551515 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551519 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551522 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551525 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551528 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551536 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551539 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551542 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551545 2579 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:17.554405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551548 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551555 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551558 2579 
flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551561 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551564 2579 flags.go:64] FLAG: --port="10250" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551568 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551570 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02d5480159bfe3726" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551574 2579 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551577 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551580 2579 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551583 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551585 2579 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551590 2579 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551593 2579 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551595 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551599 2579 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551602 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551605 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551609 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551612 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551615 2579 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551618 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551621 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551624 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551627 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551631 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:17.555010 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551634 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551637 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551640 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551646 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551649 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551652 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551655 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551658 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551661 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551666 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551669 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551672 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551678 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551681 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551684 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551687 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551690 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551694 2579 flags.go:64] FLAG: --v="2"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551699 2579 flags.go:64] FLAG: --version="false"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551703 2579 flags.go:64] FLAG: --vmodule=""
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551707 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.551711 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551827 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:17.555632 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551832 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551836 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551840 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551843 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551846 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551849 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551852 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551856 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551858 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551861 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551864 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551866 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551869 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551872 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551874 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551877 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551879 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551882 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551885 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:17.556240 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551887 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551890 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551892 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551895 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551897 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551900 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551902 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551905 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551908 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551911 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551915 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551919 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551922 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551925 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551927 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551930 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551933 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551936 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551938 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:17.556723 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551941 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551944 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551946 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551949 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551951 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551954 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551957 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551959 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551962 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551965 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551967 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551970 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551973 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551976 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551978 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551981 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551984 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551986 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551989 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551991 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:17.557223 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551994 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.551998 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552000 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552003 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552006 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552008 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552011 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552013 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552016 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552019 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 
19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552021 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552031 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552034 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552037 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552040 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552043 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552045 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552048 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552050 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552053 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:17.557788 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552055 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552058 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552060 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552062 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552066 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552068 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.552071 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:17.558294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.552515 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:23:17.559229 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.559209 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:23:17.559274 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.559230 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 
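The feature_gate.go:384 line above is the kubelet's own summary of the gate values it actually applied, printed in Go map syntax; everything in the preceding run of feature_gate.go:328 warnings was ignored. If that map is needed programmatically, a minimal sketch along the following lines works, assuming this journal has been captured to a file; "kubelet.log" is a hypothetical name, not something the kubelet writes itself.

import re

# Parse the kubelet's `feature gates: {map[Name:bool ...]}` summary line
# into a Python dict. The Go map syntax is as shown in the log above.
GATE_RE = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

def parse_feature_gates(line):
    m = GATE_RE.search(line)
    if m is None:
        return None
    pairs = (item.split(":", 1) for item in m.group(1).split())
    return {name: value == "true" for name, value in pairs}

with open("kubelet.log") as f:  # hypothetical capture of this journal
    for line in f:
        gates = parse_feature_gates(line)
        if gates is not None:
            enabled = sorted(k for k, v in gates.items() if v)
            print("enabled:", ", ".join(enabled))
            break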
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559281 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559286 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559289 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559293 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559295 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559298 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559301 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559304 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559307 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:17.559307 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559310 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559313 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559316 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559318 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559321 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559323 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559326 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559329 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559331 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559334 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559337 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559339 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559342 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559345 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559348 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559350 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559353 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559355 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559358 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559360 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:17.559569 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559363 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559366 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559369 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559372 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559375 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559377 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559380 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559383 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559386 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559388 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559391 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559393 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559397 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559402 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559405 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559408 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559411 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559414 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559417 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:17.560080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559420 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559423 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559426 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559428 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559431 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559434 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559436 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559439 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559442 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559445 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559447 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559450 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559452 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559455 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559458 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559461 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559464 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559466 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559469 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559471 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:17.560551 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559474 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559477 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559479 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559481 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559484 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559487 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559491 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559495 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559498 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559501 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559504 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559506 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559509 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559512 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559514 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559517 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559520 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:17.561062 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559522 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.559527 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559634 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559639 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559642 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559645 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559648 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559652 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559656 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559659 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559662 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559665 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559668 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559671 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559673 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:17.561483 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559676 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559678 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559681 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559683 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559686 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559689 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559691 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559695 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559698 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559700 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559703 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559706 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559708 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559711 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559713 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559716 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559719 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559721 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559739 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559743 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:17.561893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559746 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559748 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559751 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559754 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559757 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559759 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559762 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559765 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559768 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559770 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559780 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559783 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559785 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559788 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559790 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559793 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559795 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559798 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559800 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559804 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:17.562393 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559807 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559809 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559812 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559815 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559818 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559820 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559823 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559825 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559828 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559830 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559833 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559836 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559838 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559841 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559843 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559846 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559848 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559851 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559853 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559856 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:17.563009 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559859 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559861 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559864 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559872 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559875 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559877 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559880 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559882 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559885 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559889 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559892 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559896 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:17.559898 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.559904 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.560764 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:17.563504 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.562923 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:17.564065 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.564053 2579 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:17.564167 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.564148 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:17.564207 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.564198 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:17.593906 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.593882 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:17.600021 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.599996 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:17.615612 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.615587 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:17.621259 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.621237 2579 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:17.622492 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.622466 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:17.627946 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.627917 2579 fs.go:135] Filesystem UUIDs: map[3b03b8cd-55c9-42c3-8fe5-b96bc9d9fe32:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9f561255-a4ec-4093-b1ac-3cf42fc740da:/dev/nvme0n1p4]
Apr 22 19:23:17.627946 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.627942 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:17.629134 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.629115 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:23:17.633630 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.633509 2579 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:17.631711112 +0000 UTC m=+0.436552427 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100822 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2caa8300875be49eba16018cee33b3 SystemUUID:ec2caa83-0087-5be4-9eba-16018cee33b3 BootID:857eac73-ceae-4efd-9c7c-d4e54c2348dd Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3c:5a:c7:b1:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3c:5a:c7:b1:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:1a:27:98:6d:cd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:17.633630 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.633620 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
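The walls of feature_gate.go:328 warnings above are OpenShift cluster-level gates (GatewayAPI, ManagedBootImages, SigstoreImageVerification, and so on) that the upstream kubelet's gate registry does not know, so it logs each one and moves on; the set is parsed more than once during startup, which is why the same list repeats with fresh timestamps. They are noise unless a gate the node actually depends on appears in them. A rough sketch that collapses the repeats into one count per gate, again assuming a hypothetical "kubelet.log" capture of this journal:

import re
from collections import Counter

# Tally "unrecognized feature gate" warnings so repeated parse passes
# collapse into a single line per gate name.
WARN_RE = re.compile(r"unrecognized feature gate: (\S+)")

counts = Counter()
with open("kubelet.log") as f:  # hypothetical capture
    for line in f:
        counts.update(WARN_RE.findall(line))

for gate, n in counts.most_common():
    print(f"{n:3d}x {gate}")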
Apr 22 19:23:17.633784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.633707 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:17.636345 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.636316 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:17.636498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.636347 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-159.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:23:17.636544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.636507 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:23:17.636544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.636518 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:23:17.636544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.636535 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:17.637306 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.637295 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:23:17.638794 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.638783 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:17.638917 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.638908 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:23:17.641364 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.641353 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:23:17.641407 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.641368 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:23:17.641407 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.641402 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:23:17.641478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.641424 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:23:17.641478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.641442 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:23:17.642666 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.642654 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:17.642703 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.642672 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:23:17.645845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.645828 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:23:17.647067 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.647052 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 19:23:17.648455 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648438 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 19:23:17.648455 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648456 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648462 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648467 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648474 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648480 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648486 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648491 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648498 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648504 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648512 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 19:23:17.648574 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.648521 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 19:23:17.649139 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.649127 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 19:23:17.649139 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.649136 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 19:23:17.652811 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.652788 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:23:17.653250 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.653232 2579 server.go:1295] "Started kubelet"
Apr 22 19:23:17.654086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.654017 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:23:17.654177 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.654094 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:23:17.654177 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.654149 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:23:17.654314 ip-10-0-133-159 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:23:17.654442 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.654417 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-159.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:23:17.654496 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.654459 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-159.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:23:17.654496 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.654471 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:23:17.655093 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.654830 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rpqts"
Apr 22 19:23:17.655888 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.655874 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:23:17.657289 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.657272 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:23:17.660921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.660897 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rpqts"
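The sequence above shows the client-certificate bootstrap working as designed: the kubelet starts with bootstrap credentials ("Client rotation is on, will bootstrap in background"), its first list/watch calls fail as system:anonymous (the forbidden errors from reflector.go and csi_plugin.go), and within milliseconds csr-rpqts is approved and issued, after which the real client certificate takes over. Those forbidden errors are therefore expected and transient on first boot. A sketch that pulls the CSR milestones and error-severity klog lines out of a capture to confirm the handoff, with "kubelet.log" again a hypothetical file name:

import re

# CSR lifecycle lines look like: csr.go:274] "..." ... csr="csr-rpqts"
CSR_RE = re.compile(r'csr\.go:\d+\] "([^"]+)".*csr="([^"]+)"')
# klog error-severity records start with E<MMDD> <time>, e.g. E0422 19:23:17.654417
ERR_RE = re.compile(r"\bE\d{4} \d{2}:\d{2}:\d{2}\.\d+")

with open("kubelet.log") as f:  # hypothetical capture
    for line in f:
        m = CSR_RE.search(line)
        if m:
            print(f"CSR {m.group(2)}: {m.group(1)}")
        elif ERR_RE.search(line):
            print("ERR:", line.strip()[:120])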
Apr 22 19:23:17.663563 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.663533 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:23:17.664114 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664096 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:23:17.664114 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664108 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:23:17.664624 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664605 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:23:17.664624 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664623 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:23:17.664821 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664681 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:23:17.664821 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664792 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:23:17.664821 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664800 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:23:17.665014 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.664996 2579 factory.go:55] Registering systemd factory
Apr 22 19:23:17.665014 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665016 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:23:17.665180 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.665121 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found"
Apr 22 19:23:17.665232 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665204 2579 factory.go:153] Registering CRI-O factory
Apr 22 19:23:17.665232 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665220 2579 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:23:17.665323 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665268 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:23:17.665323 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665294 2579 factory.go:103] Registering Raw factory
Apr 22 19:23:17.665323 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665308 2579 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:23:17.665678 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.665664 2579 manager.go:319] Starting recovery of all containers
Apr 22 19:23:17.667764 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.667722 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:17.672766 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.672623 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-159.ec2.internal\" not found" node="ip-10-0-133-159.ec2.internal"
Apr 22 19:23:17.675249 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.675231 2579 manager.go:324] Recovery completed
Apr 22 19:23:17.680151 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.680137 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:23:17.684108 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684089 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:23:17.684178 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684121 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:23:17.684178 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684136 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:23:17.684683 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684668 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:23:17.684683 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684682 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:23:17.684848 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.684703 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:23:17.687577 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.687563 2579 policy_none.go:49] "None policy: Start"
Apr 22 19:23:17.687649 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.687582 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:23:17.687649 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.687595 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:23:17.725318 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725301 2579 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.725342 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725355 2579 server.go:85] "Starting device plugin registration server"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725657 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725670 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725774 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725861 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.725871 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.726396 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:23:17.747281 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.726442 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-159.ec2.internal\" not found"
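The two eviction_manager errors above are likewise first-boot artifacts: the control loop starts before the Node object is registered, so summary stats cannot be fetched yet. The thresholds that loop will enforce were already logged in the HardEvictionThresholds field of the "Creating Container Manager object based on Node Config" entry further up; restated as data for readability (a sketch; the signal/operator/value triples are copied from that entry, with 0.05/0.1/0.15 rendered as percentages):

# Hard eviction thresholds from the nodeConfig entry above.
thresholds = [
    ("memory.available",   "LessThan", "100Mi"),
    ("nodefs.available",   "LessThan", "10%"),
    ("nodefs.inodesFree",  "LessThan", "5%"),
    ("imagefs.available",  "LessThan", "15%"),
    ("imagefs.inodesFree", "LessThan", "5%"),
]
for signal, op, value in thresholds:
    print(f"evict when {signal} {op} {value} (grace period 0s)")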
protocol="IPv4" Apr 22 19:23:17.800445 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.800431 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:23:17.800505 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.800455 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:17.800505 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.800474 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:23:17.800505 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.800480 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:17.800665 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.800521 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:17.804096 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.804078 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:17.826352 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.826325 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:17.827517 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.827500 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:17.827614 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.827530 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:17.827614 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.827543 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:17.827614 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.827571 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.836887 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.836867 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.836959 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.836896 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-159.ec2.internal\": node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:17.849825 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.849802 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:17.900824 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.900767 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal"] Apr 22 19:23:17.901036 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.900889 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:17.901978 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.901961 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:17.902063 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.901992 2579 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:17.902063 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.902002 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:17.904372 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.904358 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:17.904553 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.904536 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.904655 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.904574 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:17.905153 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905135 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:17.905257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905164 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:17.905257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905177 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:17.905257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905137 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:17.905257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905236 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:17.905257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.905247 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:17.907440 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.907424 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.907530 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.907449 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:17.908216 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.908200 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:17.908305 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.908232 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:17.908305 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.908247 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:17.927509 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.927481 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-159.ec2.internal\" not found" node="ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.931883 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.931864 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-159.ec2.internal\" not found" node="ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.950307 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:17.950285 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:17.966610 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.966585 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b82df689b1e19e43cfff5d00b46485d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-159.ec2.internal\" (UID: \"b82df689b1e19e43cfff5d00b46485d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.966707 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.966614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/842ff22705529194224b4584003f8941-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:17.966707 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:17.966633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842ff22705529194224b4584003f8941-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.050834 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.050755 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.067336 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/842ff22705529194224b4584003f8941-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.067415 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842ff22705529194224b4584003f8941-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.067415 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b82df689b1e19e43cfff5d00b46485d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-159.ec2.internal\" (UID: \"b82df689b1e19e43cfff5d00b46485d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.067516 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/842ff22705529194224b4584003f8941-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.067516 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067438 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/842ff22705529194224b4584003f8941-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal\" (UID: \"842ff22705529194224b4584003f8941\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.067516 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.067465 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b82df689b1e19e43cfff5d00b46485d2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-159.ec2.internal\" (UID: \"b82df689b1e19e43cfff5d00b46485d2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.151772 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.151741 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.229239 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.229207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.234674 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.234645 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:18.252867 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.252840 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.353318 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.353285 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.453840 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.453803 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.502647 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.502616 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:18.554960 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.554928 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.564111 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.564086 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:23:18.564255 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.564235 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:18.564294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.564245 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:18.564294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.564245 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:18.655104 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.655023 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.662861 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.662810 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:17 +0000 UTC" deadline="2027-12-11 02:16:55.145459416 +0000 UTC" Apr 22 19:23:18.662861 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.662856 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14334h53m36.482607081s" Apr 22 19:23:18.664442 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.664426 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:18.674572 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.674536 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:18.693944 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.693921 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9bm8f" Apr 22 19:23:18.699774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.699746 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9bm8f" Apr 22 19:23:18.755796 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.755760 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.815080 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:18.815047 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842ff22705529194224b4584003f8941.slice/crio-da7622e148a4e2077764df7854c19d2fba44adacd6d37372082c7430ec05099a WatchSource:0}: Error finding container da7622e148a4e2077764df7854c19d2fba44adacd6d37372082c7430ec05099a: Status 404 returned error can't find the container with id da7622e148a4e2077764df7854c19d2fba44adacd6d37372082c7430ec05099a Apr 22 19:23:18.815321 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:18.815301 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb82df689b1e19e43cfff5d00b46485d2.slice/crio-1c290c84a7ef946b93a7f0b0613d53e1c4f511221cd8f707efb9b8c4326e6489 WatchSource:0}: Error finding container 1c290c84a7ef946b93a7f0b0613d53e1c4f511221cd8f707efb9b8c4326e6489: Status 404 returned error can't find the container with id 1c290c84a7ef946b93a7f0b0613d53e1c4f511221cd8f707efb9b8c4326e6489 Apr 22 19:23:18.820707 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:18.819869 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:18.856837 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.856801 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:18.957399 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:18.957316 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-159.ec2.internal\" not found" Apr 22 19:23:19.044563 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.044526 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:19.065100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.065071 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" Apr 22 19:23:19.079622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.079597 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:19.081372 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.081358 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" Apr 22 19:23:19.095941 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.095920 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS 
label is recommended: [must not contain dots]" Apr 22 19:23:19.529200 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.529166 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:19.642095 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.642051 2579 apiserver.go:52] "Watching apiserver" Apr 22 19:23:19.650173 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.650144 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:19.650647 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.650622 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-rpmcm","openshift-cluster-node-tuning-operator/tuned-4ntkh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal","openshift-multus/multus-additional-cni-plugins-hzp7f","openshift-multus/network-metrics-daemon-qwbg8","openshift-network-diagnostics/network-check-target-xkr7v","openshift-network-operator/iptables-alerter-tjgwl","openshift-ovn-kubernetes/ovnkube-node-2crp2","kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc","openshift-dns/node-resolver-5bh88","openshift-image-registry/node-ca-drfzw","openshift-multus/multus-t8zwd"] Apr 22 19:23:19.653231 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.653208 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.655794 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.655769 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d7xfq\"" Apr 22 19:23:19.656136 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.656115 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.656251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.656158 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:19.656251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.656159 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.657790 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.657705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.658341 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.658003 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.660185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.659971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.660185 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.660063 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:19.660341 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660200 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xn94d\"" Apr 22 19:23:19.660341 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660321 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.660483 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660463 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:19.660640 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660626 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:19.660695 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660652 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rswps\"" Apr 22 19:23:19.660763 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660463 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.660926 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.660908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:19.661361 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.661340 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.661578 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.661557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.662339 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.662317 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:19.662431 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.662383 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:19.664555 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.664534 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.666947 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.666932 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:23:19.667195 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.667171 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:23:19.667356 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.667315 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x9h4c\"" Apr 22 19:23:19.671413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.669827 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.671413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.669834 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.673269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.672628 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.673269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.672884 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z2btp\"" Apr 22 19:23:19.673269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.673102 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.673269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.673135 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.673548 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.673286 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:19.673548 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.673394 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.673548 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.673439 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vrcmd\"" Apr 22 19:23:19.674635 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.674613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:23:19.674635 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.674628 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:23:19.674825 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.674631 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:23:19.674825 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.674758 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 
22 19:23:19.674955 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.674878 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:23:19.675086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.675070 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.675588 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.675414 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.675588 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.675494 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b9xkx\"" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.675868 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-modprobe-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.675958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-tmp\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-system-cni-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-netd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e451824a-2133-4364-b91f-8b08929198a3-host-slash\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-kubernetes\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-systemd\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-systemd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-etc-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.676379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-bin\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-conf\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676452 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrgq\" (UniqueName: 
\"kubernetes.io/projected/9c0f6922-8799-4caa-adfb-fa958fee9291-kube-api-access-wgrgq\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-systemd-units\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-var-lib-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbp4\" (UniqueName: \"kubernetes.io/projected/9a484ef5-ac14-4ff2-ab99-82238200be07-kube-api-access-zxbp4\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysconfig\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-sys\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676789 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-lib-modules\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676814 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-host\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676857 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-slash\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559rr\" (UniqueName: \"kubernetes.io/projected/468346a2-9e30-4adf-916d-475adc66b11c-kube-api-access-559rr\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-os-release\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.676957 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677058 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-ovn\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-node-log\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a484ef5-ac14-4ff2-ab99-82238200be07-ovn-node-metrics-cert\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 
ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-var-lib-kubelet\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cnibin\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677308 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw7cd\" (UniqueName: \"kubernetes.io/projected/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-kube-api-access-pw7cd\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-kubelet\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-netns\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-log-socket\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677427 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-etc-tuned\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677453 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-config\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e451824a-2133-4364-b91f-8b08929198a3-iptables-alerter-script\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.677709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76a63f32-8306-496d-ab47-f0ec1293937f-agent-certs\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-run\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-env-overrides\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-script-lib\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vx25\" (UniqueName: \"kubernetes.io/projected/e451824a-2133-4364-b91f-8b08929198a3-kube-api-access-8vx25\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76a63f32-8306-496d-ab47-f0ec1293937f-konnectivity-ca\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677941 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.678723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.677952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.680269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680242 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t7m4l\"" Apr 22 19:23:19.680845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680416 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:19.680845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680482 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:19.680845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680510 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:19.680845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680620 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:19.680845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.680695 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x97sx\"" Apr 22 19:23:19.700495 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.700454 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:18 +0000 UTC" deadline="2028-02-05 18:42:28.411409802 +0000 UTC" Apr 22 19:23:19.700495 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.700493 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15695h19m8.710919805s" Apr 22 19:23:19.765469 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.765441 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:23:19.778822 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.778572 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-559rr\" (UniqueName: \"kubernetes.io/projected/468346a2-9e30-4adf-916d-475adc66b11c-kube-api-access-559rr\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.779020 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.778953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-os-release\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779020 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779005 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779129 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779129 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-node-log\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779129 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-socket-dir-parent\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779137 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-socket-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qfp\" (UniqueName: \"kubernetes.io/projected/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-kube-api-access-m8qfp\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cnibin\") pod 
\"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779242 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779255 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-log-socket\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e451824a-2133-4364-b91f-8b08929198a3-iptables-alerter-script\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-os-release\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-etc-tuned\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-config\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779409 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/54216124-3633-4740-9592-06c935cb0781-serviceca\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-node-log\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.779556 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779557 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-multus\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cnibin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-tmp\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779659 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-system-cni-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e451824a-2133-4364-b91f-8b08929198a3-host-slash\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.779934 
ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-k8s-cni-cncf-io\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-systemd\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779879 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:19.779934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-systemd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-hostroot\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-conf-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-daemon-config\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-tmp-dir\") pod 
\"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-systemd-units\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbp4\" (UniqueName: \"kubernetes.io/projected/9a484ef5-ac14-4ff2-ab99-82238200be07-kube-api-access-zxbp4\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54216124-3633-4740-9592-06c935cb0781-host\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780169 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-os-release\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-log-socket\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-sys-fs\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysconfig\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-sys\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-lib-modules\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm654\" (UniqueName: \"kubernetes.io/projected/54216124-3633-4740-9592-06c935cb0781-kube-api-access-pm654\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.780413 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cni-binary-copy\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-ovn\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780493 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a484ef5-ac14-4ff2-ab99-82238200be07-ovn-node-metrics-cert\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780664 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e451824a-2133-4364-b91f-8b08929198a3-iptables-alerter-script\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vx25\" (UniqueName: \"kubernetes.io/projected/e451824a-2133-4364-b91f-8b08929198a3-kube-api-access-8vx25\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 
19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780980 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.780996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76a63f32-8306-496d-ab47-f0ec1293937f-konnectivity-ca\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-netns\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.781117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-var-lib-kubelet\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw7cd\" (UniqueName: \"kubernetes.io/projected/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-kube-api-access-pw7cd\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-kubelet\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-netns\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-hosts-file\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.779485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cnibin\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.781562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76a63f32-8306-496d-ab47-f0ec1293937f-agent-certs\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mzx\" (UniqueName: \"kubernetes.io/projected/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-kube-api-access-p8mzx\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-systemd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-netns\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-systemd\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781902 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-systemd-units\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-system-cni-dir\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.781974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-kubelet\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.781991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-run-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-lib-modules\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-run\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782246 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysconfig\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782289 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-config\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-env-overrides\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782397 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-sys\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782402 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e451824a-2133-4364-b91f-8b08929198a3-host-slash\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-run-ovn\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-run\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782524 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-var-lib-kubelet\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.782622 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:19.782636 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782628 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-script-lib\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-bin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76a63f32-8306-496d-ab47-f0ec1293937f-konnectivity-ca\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-etc-kubernetes\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.782961 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:20.282916714 +0000 UTC m=+3.087758006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.782999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-registration-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.783082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783033 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-modprobe-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-netd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-kubelet\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-kubernetes\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783270 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-kubernetes\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-etc-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: 
\"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783378 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-etc-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-modprobe-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-bin\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-netd\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-d\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783629 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-env-overrides\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.783750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783720 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-cni-bin\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r27pf\" (UniqueName: \"kubernetes.io/projected/479bcdb7-ebbf-4317-9949-e55ece55ec17-kube-api-access-r27pf\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-conf\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783889 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrgq\" (UniqueName: \"kubernetes.io/projected/9c0f6922-8799-4caa-adfb-fa958fee9291-kube-api-access-wgrgq\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9a484ef5-ac14-4ff2-ab99-82238200be07-ovnkube-script-lib\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.783980 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-etc-sysctl-conf\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-var-lib-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.784099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-host\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.784425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-slash\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.784425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-system-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.784425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-multus-certs\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.784425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784309 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-device-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.784623 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-var-lib-openvswitch\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.784623 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/468346a2-9e30-4adf-916d-475adc66b11c-host\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.785544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.784986 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-tmp\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.785651 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.785540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a484ef5-ac14-4ff2-ab99-82238200be07-host-slash\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.786312 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.786289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/468346a2-9e30-4adf-916d-475adc66b11c-etc-tuned\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.786641 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.786618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76a63f32-8306-496d-ab47-f0ec1293937f-agent-certs\") pod \"konnectivity-agent-rpmcm\" (UID: \"76a63f32-8306-496d-ab47-f0ec1293937f\") " pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:19.786913 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.786892 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a484ef5-ac14-4ff2-ab99-82238200be07-ovn-node-metrics-cert\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.787922 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.787855 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:19.787922 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.787878 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:19.787922 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.787891 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:19.788191 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:19.787967 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:20.287950131 +0000 UTC m=+3.092791415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:19.788723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.788648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-559rr\" (UniqueName: \"kubernetes.io/projected/468346a2-9e30-4adf-916d-475adc66b11c-kube-api-access-559rr\") pod \"tuned-4ntkh\" (UID: \"468346a2-9e30-4adf-916d-475adc66b11c\") " pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" Apr 22 19:23:19.790512 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.790476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vx25\" (UniqueName: \"kubernetes.io/projected/e451824a-2133-4364-b91f-8b08929198a3-kube-api-access-8vx25\") pod \"iptables-alerter-tjgwl\" (UID: \"e451824a-2133-4364-b91f-8b08929198a3\") " pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.791014 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.790990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw7cd\" (UniqueName: \"kubernetes.io/projected/1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d-kube-api-access-pw7cd\") pod \"multus-additional-cni-plugins-hzp7f\" (UID: \"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d\") " pod="openshift-multus/multus-additional-cni-plugins-hzp7f" Apr 22 19:23:19.791161 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.791141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbp4\" (UniqueName: \"kubernetes.io/projected/9a484ef5-ac14-4ff2-ab99-82238200be07-kube-api-access-zxbp4\") pod \"ovnkube-node-2crp2\" (UID: \"9a484ef5-ac14-4ff2-ab99-82238200be07\") " pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:19.791667 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.791647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrgq\" (UniqueName: \"kubernetes.io/projected/9c0f6922-8799-4caa-adfb-fa958fee9291-kube-api-access-wgrgq\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:19.805342 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.805300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" event={"ID":"b82df689b1e19e43cfff5d00b46485d2","Type":"ContainerStarted","Data":"1c290c84a7ef946b93a7f0b0613d53e1c4f511221cd8f707efb9b8c4326e6489"} Apr 22 19:23:19.806294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.806270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" event={"ID":"842ff22705529194224b4584003f8941","Type":"ContainerStarted","Data":"da7622e148a4e2077764df7854c19d2fba44adacd6d37372082c7430ec05099a"} Apr 22 19:23:19.884648 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r27pf\" (UniqueName: \"kubernetes.io/projected/479bcdb7-ebbf-4317-9949-e55ece55ec17-kube-api-access-r27pf\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.884648 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-system-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-multus-certs\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-device-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-socket-dir-parent\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-socket-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qfp\" (UniqueName: \"kubernetes.io/projected/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-kube-api-access-m8qfp\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884775 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-multus-certs\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-device-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-system-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-socket-dir-parent\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/54216124-3633-4740-9592-06c935cb0781-serviceca\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.884903 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-multus\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cnibin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-socket-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884970 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-k8s-cni-cncf-io\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.884989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-multus\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-hostroot\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-cni-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-conf-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cnibin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-daemon-config\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885078 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-k8s-cni-cncf-io\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-tmp-dir\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885109 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-hostroot\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-conf-dir\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/54216124-3633-4740-9592-06c935cb0781-serviceca\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54216124-3633-4740-9592-06c935cb0781-host\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54216124-3633-4740-9592-06c935cb0781-host\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.885272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-os-release\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-sys-fs\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm654\" (UniqueName: \"kubernetes.io/projected/54216124-3633-4740-9592-06c935cb0781-kube-api-access-pm654\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-os-release\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-tmp-dir\") pod \"node-resolver-5bh88\" (UID: 
\"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cni-binary-copy\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885395 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-sys-fs\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-netns\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885483 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-hosts-file\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-run-netns\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mzx\" (UniqueName: \"kubernetes.io/projected/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-kube-api-access-p8mzx\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-bin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: 
\"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-etc-kubernetes\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-registration-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-etc-kubernetes\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-multus-daemon-config\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.885893 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-hosts-file\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-registration-dir\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-cni-bin\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-kubelet\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-host-var-lib-kubelet\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/479bcdb7-ebbf-4317-9949-e55ece55ec17-etc-selinux\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.886498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.885828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-cni-binary-copy\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.894385 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.894345 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r27pf\" (UniqueName: \"kubernetes.io/projected/479bcdb7-ebbf-4317-9949-e55ece55ec17-kube-api-access-r27pf\") pod \"aws-ebs-csi-driver-node-vmzwc\" (UID: \"479bcdb7-ebbf-4317-9949-e55ece55ec17\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" Apr 22 19:23:19.894545 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.894389 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qfp\" (UniqueName: \"kubernetes.io/projected/73366f6c-dc1e-4c5b-a1f3-e3d7839a351e-kube-api-access-m8qfp\") pod \"node-resolver-5bh88\" (UID: \"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e\") " pod="openshift-dns/node-resolver-5bh88" Apr 22 19:23:19.894609 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.894565 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm654\" (UniqueName: \"kubernetes.io/projected/54216124-3633-4740-9592-06c935cb0781-kube-api-access-pm654\") pod \"node-ca-drfzw\" (UID: \"54216124-3633-4740-9592-06c935cb0781\") " pod="openshift-image-registry/node-ca-drfzw" Apr 22 19:23:19.894609 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.894600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mzx\" (UniqueName: \"kubernetes.io/projected/04dcf06d-ab97-49fd-b8b0-d5036c249ae1-kube-api-access-p8mzx\") pod \"multus-t8zwd\" (UID: \"04dcf06d-ab97-49fd-b8b0-d5036c249ae1\") " pod="openshift-multus/multus-t8zwd" Apr 22 19:23:19.976511 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.976480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjgwl" Apr 22 19:23:19.985172 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.985144 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 19:23:19.976511 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.976480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjgwl"
Apr 22 19:23:19.985172 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.985144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4ntkh"
Apr 22 19:23:19.994967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:19.994942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzp7f"
Apr 22 19:23:20.001626 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.001605 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rpmcm"
Apr 22 19:23:20.009366 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.009217 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc"
Apr 22 19:23:20.017956 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.017932 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2"
Apr 22 19:23:20.025603 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.025579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bh88"
Apr 22 19:23:20.033359 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.033297 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t8zwd"
Apr 22 19:23:20.036741 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.036708 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:23:20.038827 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.038805 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-drfzw"
Apr 22 19:23:20.289043 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.288952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:20.289043 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.289001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289121 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289136 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289159 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289173 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289189 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:21.289175258 +0000 UTC m=+4.094016540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:20.289262 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:20.289224 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:21.289208813 +0000 UTC m=+4.094050095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:20.520291 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.520261 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04dcf06d_ab97_49fd_b8b0_d5036c249ae1.slice/crio-472387ee3b3aaf47c603e3d237fb36d613f07de4123f9d0a3a4a2a50c5b3f40d WatchSource:0}: Error finding container 472387ee3b3aaf47c603e3d237fb36d613f07de4123f9d0a3a4a2a50c5b3f40d: Status 404 returned error can't find the container with id 472387ee3b3aaf47c603e3d237fb36d613f07de4123f9d0a3a4a2a50c5b3f40d
Apr 22 19:23:20.521511 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.521430 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73366f6c_dc1e_4c5b_a1f3_e3d7839a351e.slice/crio-a38eed59db7a6bd0ec7c8c9b0d4d30d3fcedf364c3264d2b69dec7d221cf1444 WatchSource:0}: Error finding container a38eed59db7a6bd0ec7c8c9b0d4d30d3fcedf364c3264d2b69dec7d221cf1444: Status 404 returned error can't find the container with id a38eed59db7a6bd0ec7c8c9b0d4d30d3fcedf364c3264d2b69dec7d221cf1444
Apr 22 19:23:20.524038 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.523965 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod468346a2_9e30_4adf_916d_475adc66b11c.slice/crio-079c5f5cbdc845a818a9c1cd5d196a748a66c2fc7fea1bc5ef142869e4b0db23 WatchSource:0}: Error finding container 079c5f5cbdc845a818a9c1cd5d196a748a66c2fc7fea1bc5ef142869e4b0db23: Status 404 returned error can't find the container with id 079c5f5cbdc845a818a9c1cd5d196a748a66c2fc7fea1bc5ef142869e4b0db23
Apr 22 19:23:20.527763 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.527671 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cfa1b6f_8796_4d69_9c36_cd2bfdc2280d.slice/crio-7424465b20ec27261964816990b383e0d95d666c6697de36c60a9e387babc3d0 WatchSource:0}: Error finding container 7424465b20ec27261964816990b383e0d95d666c6697de36c60a9e387babc3d0: Status 404 returned error can't find the container with id 7424465b20ec27261964816990b383e0d95d666c6697de36c60a9e387babc3d0
Apr 22 19:23:20.528858 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.528833 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a484ef5_ac14_4ff2_ab99_82238200be07.slice/crio-464818265d8733450d3e1e5101c5fef4814177fb2d387c2ff0fa00d53f6e8a01 WatchSource:0}: Error finding container 464818265d8733450d3e1e5101c5fef4814177fb2d387c2ff0fa00d53f6e8a01: Status 404 returned error can't find the container with id 464818265d8733450d3e1e5101c5fef4814177fb2d387c2ff0fa00d53f6e8a01
Apr 22 19:23:20.529928 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.529904 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod479bcdb7_ebbf_4317_9949_e55ece55ec17.slice/crio-6ee6b7e61b30e649d316089b11726eeba09fa9b553474532e184798d8635e243 WatchSource:0}: Error finding container 6ee6b7e61b30e649d316089b11726eeba09fa9b553474532e184798d8635e243: Status 404 returned error can't find the container with id 6ee6b7e61b30e649d316089b11726eeba09fa9b553474532e184798d8635e243
Apr 22 19:23:20.530547 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.530515 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54216124_3633_4740_9592_06c935cb0781.slice/crio-590ccde34343e224372f7b6d6130f350e57e63059e59e16aa9ee95c2b8b59432 WatchSource:0}: Error finding container 590ccde34343e224372f7b6d6130f350e57e63059e59e16aa9ee95c2b8b59432: Status 404 returned error can't find the container with id 590ccde34343e224372f7b6d6130f350e57e63059e59e16aa9ee95c2b8b59432
Apr 22 19:23:20.531903 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.531875 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode451824a_2133_4364_b91f_8b08929198a3.slice/crio-03d7fd3a2a2372e22e3c93a0782d24e72e55025341a9e2f3485082cd265d80dc WatchSource:0}: Error finding container 03d7fd3a2a2372e22e3c93a0782d24e72e55025341a9e2f3485082cd265d80dc: Status 404 returned error can't find the container with id 03d7fd3a2a2372e22e3c93a0782d24e72e55025341a9e2f3485082cd265d80dc
Apr 22 19:23:20.533313 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:20.533214 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a63f32_8306_496d_ab47_f0ec1293937f.slice/crio-2ab35829ad99a269809b70a4f07ec85bf98297e5678c4df59c46794efca9ea62 WatchSource:0}: Error finding container 2ab35829ad99a269809b70a4f07ec85bf98297e5678c4df59c46794efca9ea62: Status 404 returned error can't find the container with id 2ab35829ad99a269809b70a4f07ec85bf98297e5678c4df59c46794efca9ea62
Apr 22 19:23:20.700923 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.700878 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:18 +0000 UTC" deadline="2028-01-28 05:36:34.366437062 +0000 UTC"
Apr 22 19:23:20.700923 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.700918 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15490h13m13.665522282s"
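The certificate_manager.go pair above picks a rotation deadline well before expiry and sleeps until then (here ~15490h). Upstream kubelet derives that deadline as a jittered fraction, roughly 70 to 90 percent, of the certificate's validity window; treat the exact fractions, and the notBefore value below, as assumptions rather than facts from this log:

```go
// Rough sketch of deriving a rotation deadline like the one logged above.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a point 70-90% of the way through the cert's life.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	frac := 0.7 + 0.2*rand.Float64() // uniform in [0.7, 0.9)
	return notBefore.Add(time.Duration(frac * float64(total)))
}

func main() {
	notBefore := time.Date(2026, 4, 22, 19, 18, 18, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2028, 4, 21, 19, 18, 18, 0, time.UTC)  // expiration from the log
	d := rotationDeadline(notBefore, notAfter)
	fmt.Println("deadline:", d) // lands in late 2027 / early 2028, as in the log
}
```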
event={"ID":"04dcf06d-ab97-49fd-b8b0-d5036c249ae1","Type":"ContainerStarted","Data":"472387ee3b3aaf47c603e3d237fb36d613f07de4123f9d0a3a4a2a50c5b3f40d"} Apr 22 19:23:20.828443 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:20.828397 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-159.ec2.internal" podStartSLOduration=1.828381057 podStartE2EDuration="1.828381057s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:20.827537129 +0000 UTC m=+3.632378432" watchObservedRunningTime="2026-04-22 19:23:20.828381057 +0000 UTC m=+3.633222363" Apr 22 19:23:21.296598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.296556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:21.296779 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.296608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:21.296779 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296770 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:21.296882 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296788 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:21.296882 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296801 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:21.296882 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296853 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:23.296835343 +0000 UTC m=+6.101676632 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:21.297041 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296917 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:21.297041 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.296980 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:23.296967757 +0000 UTC m=+6.101809043 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:21.704425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.704342 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:21.803783 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.803750 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:21.803942 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.803885 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:21.804334 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.804304 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:21.804470 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:21.804417 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:21.836377 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.836338 2579 generic.go:358] "Generic (PLEG): container finished" podID="842ff22705529194224b4584003f8941" containerID="80d0d60353d65a5001c76f58206ae80ea4bef50595d08a913511fcde6c8adaad" exitCode=0 Apr 22 19:23:21.837480 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:21.837431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" event={"ID":"842ff22705529194224b4584003f8941","Type":"ContainerDied","Data":"80d0d60353d65a5001c76f58206ae80ea4bef50595d08a913511fcde6c8adaad"} Apr 22 19:23:22.844755 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:22.844633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" event={"ID":"842ff22705529194224b4584003f8941","Type":"ContainerStarted","Data":"7f104853fb61084b93f9cac9df5e62f40bf37d607cf37a8b026a72fbd9c84f23"} Apr 22 19:23:23.313183 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.313144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.313196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313321 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313338 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313350 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:23.313566 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313407 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:27.313388612 +0000 UTC m=+10.118229894 (durationBeforeRetry 4s). 
Apr 22 19:23:23.313183 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.313144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.313196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313321 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313338 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:23.313373 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313350 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:23.313566 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313407 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:27.313388612 +0000 UTC m=+10.118229894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:23.313867 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313845 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:23.313977 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.313903 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:27.313887017 +0000 UTC m=+10.118728300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:23.801673 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.801629 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:23.801872 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.801790 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
Apr 22 19:23:23.802235 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:23.802209 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:23.802402 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:23.802376 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291"
Apr 22 19:23:25.801639 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:25.801600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:25.802047 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:25.801800 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:25.802047 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:25.801807 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291"
Apr 22 19:23:25.802047 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:25.801909 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
Apr 22 19:23:27.346274 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:27.346236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:27.346298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346418 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346449 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346467 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346481 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346505 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.346484674 +0000 UTC m=+18.151325961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:27.346794 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.346524 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.346512677 +0000 UTC m=+18.151353959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:27.806176 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:27.803581 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:27.806176 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.803753 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291"
Apr 22 19:23:27.806176 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:27.804185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:27.806176 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:27.804302 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
Apr 22 19:23:29.801742 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:29.801688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:29.802201 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:29.801705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:29.802201 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:29.801846 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291"
Apr 22 19:23:29.802201 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:29.801935 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
Apr 22 19:23:31.801194 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:31.801155 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:23:31.801757 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:31.801196 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v"
Apr 22 19:23:31.801757 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:31.801301 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291"
Apr 22 19:23:31.801757 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:31.801440 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
Apr 22 19:23:32.035640 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.035543 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-159.ec2.internal" podStartSLOduration=13.035526981 podStartE2EDuration="13.035526981s" podCreationTimestamp="2026-04-22 19:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:22.861841841 +0000 UTC m=+5.666683145" watchObservedRunningTime="2026-04-22 19:23:32.035526981 +0000 UTC m=+14.840368284"
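The pod_startup_latency_tracker.go line above is simple arithmetic: podStartE2EDuration is the gap between podCreationTimestamp (19:23:19) and watchObservedRunningTime (19:23:32.035...), and with the image-pull timestamps unset the SLO duration equals it. Reproducing the 13.035526981s figure:

```go
// Check of the startup-duration arithmetic from the log line above.
package main

import (
	"fmt"
	"time"
)

func main() {
	created := time.Date(2026, 4, 22, 19, 23, 19, 0, time.UTC)
	observedRunning := time.Date(2026, 4, 22, 19, 23, 32, 35526981, time.UTC)
	// pull time is zero here (firstStartedPulling/lastFinishedPulling unset)
	fmt.Println("podStartE2EDuration:", observedRunning.Sub(created)) // 13.035526981s
}
```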
pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:32.082075 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.081980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.082233 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.082092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-kubelet-config\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.082233 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.082114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-dbus\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.182715 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.182669 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.182886 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:32.182846 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.182952 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:32.182924 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:32.682907809 +0000 UTC m=+15.487749091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.182952 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.182945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-kubelet-config\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.183069 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.182975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-dbus\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.183069 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.183050 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-kubelet-config\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.183150 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.183138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ffc97da9-d997-4214-bf06-cbdb4a551c74-dbus\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.686831 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:32.686784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:32.687011 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:32.686959 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:32.687072 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:32.687046 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:33.687024312 +0000 UTC m=+16.491865597 (durationBeforeRetry 1s). 
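The repeated 'object "kube-system"/"original-pull-secret" not registered' failures above come from kubelet's watch-based object cache: secrets and configmaps are only tracked once a pod referencing them has been registered with the manager, and lookups before that (or for objects that never get registered) fail fast with exactly this wording. A toy sketch of the pattern, with invented types:

```go
// Sketch of a register-before-get object cache (not kubelet's real manager).
package main

import "fmt"

type objectCache struct {
	registered map[string]bool   // namespace/name -> being watched
	data       map[string]string // namespace/name -> payload once synced
}

func (c *objectCache) RegisterPodObject(key string) { c.registered[key] = true }

// Get refuses to serve objects nobody registered, mirroring the log errors.
func (c *objectCache) Get(key string) (string, error) {
	if !c.registered[key] {
		return "", fmt.Errorf("object %q not registered", key)
	}
	v, ok := c.data[key]
	if !ok {
		return "", fmt.Errorf("object %q not yet synced", key)
	}
	return v, nil
}

func main() {
	c := &objectCache{registered: map[string]bool{}, data: map[string]string{}}
	_, err := c.Get("kube-system/original-pull-secret")
	fmt.Println(err) // object "kube-system/original-pull-secret" not registered
}
```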
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.693959 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:33.693919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:33.694438 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:33.694100 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.694438 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:33.694180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.694159527 +0000 UTC m=+18.499000822 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:33.801332 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:33.801294 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:33.801332 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:33.801329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:33.801555 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:33.801299 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:33.801555 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:33.801426 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74"
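
Note: the durationBeforeRetry values in the records above trace the kubelet's per-volume exponential backoff. Each failed MountVolume.SetUp on original-pull-secret doubles the delay: 500ms, 1s, 2s so far, with 4s, 8s, and 16s following later in this log (and a 32s retry for the metrics-certs volume below). A minimal Go sketch of that doubling schedule; the 2m2s cap mirrors upstream kubelet defaults and is an assumption here, and this is an illustration, not the kubelet's nestedpendingoperations implementation:

    package main

    import (
        "fmt"
        "time"
    )

    // backoffSchedule reproduces the retry delays visible in the log:
    // 500ms, 1s, 2s, 4s, ... doubling until it reaches maxDelay.
    func backoffSchedule(initial, maxDelay time.Duration, attempts int) []time.Duration {
        var schedule []time.Duration
        d := initial
        for i := 0; i < attempts; i++ {
            schedule = append(schedule, d)
            d *= 2
            if d > maxDelay {
                d = maxDelay
            }
        }
        return schedule
    }

    func main() {
        // Prints [500ms 1s 2s 4s 8s 16s 32s 1m4s]; the first six entries match
        // the delays attached to this volume's retries throughout the log.
        fmt.Println(backoffSchedule(500*time.Millisecond, 122*time.Second, 8))
    }
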
pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:33.801652 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:33.801619 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:35.409222 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.409183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.409232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409344 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409354 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409370 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409382 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gdx4h for pod openshift-network-diagnostics/network-check-target-xkr7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409430 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:51.409409816 +0000 UTC m=+34.214251104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:35.409659 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.409444 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h podName:5037e972-6e10-4b21-bde5-a072bf744013 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:51.409437995 +0000 UTC m=+34.214279277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gdx4h" (UniqueName: "kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h") pod "network-check-target-xkr7v" (UID: "5037e972-6e10-4b21-bde5-a072bf744013") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:35.711997 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.711899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:35.712142 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.712037 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:35.712142 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.712105 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:39.712090199 +0000 UTC m=+22.516931481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:35.801682 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.801646 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:35.801872 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.801782 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:35.801872 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.801795 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:35.801976 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.801926 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:35.801976 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:35.801958 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:35.802088 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:35.802065 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:37.801789 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.801594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:37.802251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.801653 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:37.802251 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:37.801882 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:37.802251 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:37.801938 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:37.802251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.801674 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:37.802251 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:37.802022 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:37.869604 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.869568 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" event={"ID":"479bcdb7-ebbf-4317-9949-e55ece55ec17","Type":"ContainerStarted","Data":"e124298cc15fa43cc72fa6729b032af7b3315370c2d5ae3640c614c4e7bed23f"} Apr 22 19:23:37.870763 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.870723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerStarted","Data":"feb1466f0f2b761e9a9ee30dcff253f3028e7d4c522ad5750394f61a9a056c80"} Apr 22 19:23:37.872055 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.872025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" event={"ID":"468346a2-9e30-4adf-916d-475adc66b11c","Type":"ContainerStarted","Data":"f47712c96a214f2c8e9034774ec28362dc5d1fe5dc9ed8ed8fb723d2cd3c57cb"} Apr 22 19:23:37.873302 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.873280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5bh88" event={"ID":"73366f6c-dc1e-4c5b-a1f3-e3d7839a351e","Type":"ContainerStarted","Data":"04a015b1319ab5182b0ba022b94d434953840b9a0b1a1cdadb96a84394441f40"} Apr 22 19:23:37.874515 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.874495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rpmcm" event={"ID":"76a63f32-8306-496d-ab47-f0ec1293937f","Type":"ContainerStarted","Data":"46c6d9afc9fd3d4ba6eb6429bdec9d7db47c247f260308f8edf1ba5c37c927cd"} Apr 22 19:23:37.875685 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.875666 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-drfzw" event={"ID":"54216124-3633-4740-9592-06c935cb0781","Type":"ContainerStarted","Data":"4fdd3e17b66567aad71889d6dc5ee6086bb746a89f4056522d3cf11592f375db"} Apr 22 19:23:37.877435 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.877416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"1d0bc01f1f049c2d6268796fdf0a24754d5019f3c09c3dd13fb0bc6ea0a59d8c"} Apr 22 19:23:37.877506 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.877439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"b3b272278a2326596b164f493d8beabb674f02c7044eef41a9b88b8ddbb6ed26"} Apr 22 19:23:37.877506 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.877453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"f877987641e90014c971000f033bd9679dc4ec53beb11a0e97be8d94e4e36c61"} Apr 22 19:23:37.878798 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.878755 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t8zwd" event={"ID":"04dcf06d-ab97-49fd-b8b0-d5036c249ae1","Type":"ContainerStarted","Data":"082ba181137ad699c1587bb4091169fa59439058201f757748d8cc24ddfe5e65"} Apr 22 19:23:37.911498 ip-10-0-133-159 kubenswrapper[2579]: I0422 
19:23:37.911456 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-drfzw" podStartSLOduration=11.828114352 podStartE2EDuration="20.911442181s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.533024067 +0000 UTC m=+3.337865355" lastFinishedPulling="2026-04-22 19:23:29.6163519 +0000 UTC m=+12.421193184" observedRunningTime="2026-04-22 19:23:37.911122282 +0000 UTC m=+20.715963587" watchObservedRunningTime="2026-04-22 19:23:37.911442181 +0000 UTC m=+20.716283548" Apr 22 19:23:37.931554 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.931510 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4ntkh" podStartSLOduration=4.106792918 podStartE2EDuration="20.931496483s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.526400956 +0000 UTC m=+3.331242238" lastFinishedPulling="2026-04-22 19:23:37.351104502 +0000 UTC m=+20.155945803" observedRunningTime="2026-04-22 19:23:37.931327242 +0000 UTC m=+20.736168546" watchObservedRunningTime="2026-04-22 19:23:37.931496483 +0000 UTC m=+20.736337787" Apr 22 19:23:37.948128 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.948078 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t8zwd" podStartSLOduration=3.870185588 podStartE2EDuration="20.948063337s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.522081332 +0000 UTC m=+3.326922614" lastFinishedPulling="2026-04-22 19:23:37.599959071 +0000 UTC m=+20.404800363" observedRunningTime="2026-04-22 19:23:37.947963692 +0000 UTC m=+20.752805019" watchObservedRunningTime="2026-04-22 19:23:37.948063337 +0000 UTC m=+20.752904641" Apr 22 19:23:37.961621 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.961561 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rpmcm" podStartSLOduration=4.185819371 podStartE2EDuration="20.961539755s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.536474349 +0000 UTC m=+3.341315631" lastFinishedPulling="2026-04-22 19:23:37.312194724 +0000 UTC m=+20.117036015" observedRunningTime="2026-04-22 19:23:37.961250393 +0000 UTC m=+20.766091696" watchObservedRunningTime="2026-04-22 19:23:37.961539755 +0000 UTC m=+20.766381060" Apr 22 19:23:37.976345 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:37.976296 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5bh88" podStartSLOduration=4.188237062 podStartE2EDuration="20.976278235s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.524152512 +0000 UTC m=+3.328993796" lastFinishedPulling="2026-04-22 19:23:37.312193675 +0000 UTC m=+20.117034969" observedRunningTime="2026-04-22 19:23:37.976117659 +0000 UTC m=+20.780958972" watchObservedRunningTime="2026-04-22 19:23:37.976278235 +0000 UTC m=+20.781119536" Apr 22 19:23:38.883532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.883353 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log"
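
Note: in the pod_startup_latency_tracker records above, podStartSLOduration is podStartE2EDuration minus the time spent pulling images (the SLO metric deliberately excludes pull time). The logged fields for node-ca-drfzw check out exactly, using the monotonic m=+... readings for the pull window:

    podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp
                        = 19:23:37.911442181 - 19:23:17.000000000 = 20.911442181s
    image pull time     = lastFinishedPulling - firstStartedPulling
                        = m=+12.421193184 - m=+3.337865355        =  9.083327829s
    podStartSLOduration = 20.911442181s - 9.083327829s            = 11.828114352s

which is exactly the logged podStartSLOduration=11.828114352.
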
containerID="b3b272278a2326596b164f493d8beabb674f02c7044eef41a9b88b8ddbb6ed26" exitCode=1 Apr 22 19:23:38.884210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.883856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerDied","Data":"b3b272278a2326596b164f493d8beabb674f02c7044eef41a9b88b8ddbb6ed26"} Apr 22 19:23:38.884210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.883880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"879d0e5386a03ff3827d69cb58785e323ac55ced770f93fcb3f21fdaf6b98fe0"} Apr 22 19:23:38.884210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.883890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"dbc3b26be87d62fac4784fb9a23b31b0665de3d964f56689f3a9d1e27d751b45"} Apr 22 19:23:38.884210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.883902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"35b2bef83a3f48bc03ceb50e16bff592dc0705d659225a11e2660ae63393908b"} Apr 22 19:23:38.884980 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.884953 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tjgwl" event={"ID":"e451824a-2133-4364-b91f-8b08929198a3","Type":"ContainerStarted","Data":"190607354c046051da42f263d5843c0dddc2fe4cfce98032caa8c8bc8a6728f8"} Apr 22 19:23:38.886897 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.886814 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="feb1466f0f2b761e9a9ee30dcff253f3028e7d4c522ad5750394f61a9a056c80" exitCode=0 Apr 22 19:23:38.887030 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.886955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"feb1466f0f2b761e9a9ee30dcff253f3028e7d4c522ad5750394f61a9a056c80"} Apr 22 19:23:38.907372 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:38.907332 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tjgwl" podStartSLOduration=5.129863542 podStartE2EDuration="21.907317133s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.534790595 +0000 UTC m=+3.339631890" lastFinishedPulling="2026-04-22 19:23:37.312244185 +0000 UTC m=+20.117085481" observedRunningTime="2026-04-22 19:23:38.907216381 +0000 UTC m=+21.712057685" watchObservedRunningTime="2026-04-22 19:23:38.907317133 +0000 UTC m=+21.712158437" Apr 22 19:23:39.148306 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.148273 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:39.171040 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.171014 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:39.171637 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:23:39.171616 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:39.736859 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.736743 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:39.148300228Z","UUID":"9491a20e-aa05-4056-9d38-eda5861fb92b","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:39.739157 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.739133 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:39.739278 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.739166 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:39.741659 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.741631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:39.741870 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:39.741851 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:39.741984 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:39.741966 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:47.741904422 +0000 UTC m=+30.546745720 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:39.801395 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.801355 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:39.801557 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:39.801492 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013"
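
Note: the plugin_watcher → RegisterPlugin → csi_plugin sequence above is the kubelet's plugin registration flow: the driver's registrar drops a UNIX socket into /var/lib/kubelet/plugins_registry/, the watcher adds it to the desired-state cache, and the kubelet then validates and registers the CSI driver (here ebs.csi.aws.com, served at /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock, CSI version 1.0.0). A small Go diagnostic, illustrative only and not part of the kubelet, that lists registration sockets and probes that each accepts connections:

    package main

    import (
        "fmt"
        "net"
        "os"
        "path/filepath"
        "strings"
        "time"
    )

    func main() {
        // Registry path taken from the plugin_watcher record above.
        const registry = "/var/lib/kubelet/plugins_registry"
        entries, err := os.ReadDir(registry)
        if err != nil {
            fmt.Println("cannot read plugin registry:", err)
            return
        }
        for _, e := range entries {
            if !strings.HasSuffix(e.Name(), ".sock") {
                continue
            }
            sock := filepath.Join(registry, e.Name())
            conn, err := net.DialTimeout("unix", sock, time.Second)
            if err != nil {
                fmt.Printf("%s: not accepting connections: %v\n", sock, err)
                continue
            }
            conn.Close()
            fmt.Printf("%s: registration socket is live\n", sock)
        }
    }
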
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:39.801676 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:39.801589 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:39.801676 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.801355 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:39.801799 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:39.801678 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:39.891234 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.891196 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" event={"ID":"479bcdb7-ebbf-4317-9949-e55ece55ec17","Type":"ContainerStarted","Data":"77a12b9b669b03c956fb68e1936a3a2cb2459db0de01514015832364dea62576"} Apr 22 19:23:39.891715 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.891505 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:39.892011 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:39.891990 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rpmcm" Apr 22 19:23:40.895959 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:40.895685 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:23:40.896554 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:40.896524 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"f347631c41f707519ec941e3917c438ef99345c862e9f99c2698e5ff7014f7b5"} Apr 22 19:23:40.898525 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:40.898495 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" event={"ID":"479bcdb7-ebbf-4317-9949-e55ece55ec17","Type":"ContainerStarted","Data":"ac6aeb48a5dc8f16c54f146fa04b6838fa5ba82d36bc82f59c8388ab343d7a44"} Apr 22 19:23:41.801426 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:41.801392 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:41.801654 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:41.801515 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:41.801654 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:41.801519 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:41.801654 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:41.801633 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:41.801834 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:41.801685 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:41.801834 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:41.801777 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:42.908127 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:42.908106 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:23:43.801876 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.801674 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:43.802047 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.801683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:43.802047 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:43.801939 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:43.802047 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.801715 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:43.802047 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:43.802025 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:43.802222 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:43.802112 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:43.912402 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.912370 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="267659744a00eaa6d796b6b63f8a7ccbc19aa1d39bf6a01e0fe8b53859af688a" exitCode=0 Apr 22 19:23:43.912847 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.912458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"267659744a00eaa6d796b6b63f8a7ccbc19aa1d39bf6a01e0fe8b53859af688a"} Apr 22 19:23:43.915593 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.915520 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:23:43.915864 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.915838 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"4fbff1b3337d2444401b32ac20b50632834273273570c14bb85142edad133ea4"} Apr 22 19:23:43.916104 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.916087 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:43.916165 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.916115 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:43.916310 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.916292 2579 scope.go:117] "RemoveContainer" containerID="b3b272278a2326596b164f493d8beabb674f02c7044eef41a9b88b8ddbb6ed26" Apr 22 19:23:43.930958 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.930936 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:43.945571 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:43.945531 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vmzwc" podStartSLOduration=7.069016368 podStartE2EDuration="26.945514718s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.533076138 +0000 UTC m=+3.337917422" lastFinishedPulling="2026-04-22 19:23:40.409574484 +0000 UTC m=+23.214415772" observedRunningTime="2026-04-22 19:23:40.919426772 +0000 UTC m=+23.724268078" watchObservedRunningTime="2026-04-22 19:23:43.945514718 +0000 UTC m=+26.750356004" Apr 22 19:23:44.856767 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.856718 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qwbg8"] Apr 22 19:23:44.856958 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.856874 
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:44.857012 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:44.856991 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:44.860175 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.859925 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xkr7v"] Apr 22 19:23:44.860175 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.860044 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:44.860175 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:44.860138 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:44.860606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.860589 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjl96"] Apr 22 19:23:44.860693 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.860680 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:44.860851 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:44.860787 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:44.920267 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.920235 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="3936014c88508d687bd6ab95415980f69f91393e956b46e7800ed6b178cf23b7" exitCode=0 Apr 22 19:23:44.920679 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.920321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"3936014c88508d687bd6ab95415980f69f91393e956b46e7800ed6b178cf23b7"} Apr 22 19:23:44.924073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.924052 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:23:44.924438 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.924416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" event={"ID":"9a484ef5-ac14-4ff2-ab99-82238200be07","Type":"ContainerStarted","Data":"5e56742b6ce87f4bf8b3c4ebbb8b3d8fa792ae31d4b036d3742de190ab8c6416"} Apr 22 19:23:44.924665 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.924651 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:44.939602 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.939575 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:23:44.979229 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:44.979171 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" podStartSLOduration=11.092366501 podStartE2EDuration="27.979151705s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.531693693 +0000 UTC m=+3.336534990" lastFinishedPulling="2026-04-22 19:23:37.418478909 +0000 UTC m=+20.223320194" observedRunningTime="2026-04-22 19:23:44.97731439 +0000 UTC m=+27.782155693" watchObservedRunningTime="2026-04-22 19:23:44.979151705 +0000 UTC m=+27.783993063" Apr 22 19:23:45.928206 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:45.928021 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="9f7f05f2f920a22acae85812fbe56d2f1d0502ca165c5c03011a17414692ec1c" exitCode=0 Apr 22 19:23:45.928577 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:45.928059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"9f7f05f2f920a22acae85812fbe56d2f1d0502ca165c5c03011a17414692ec1c"} Apr 22 19:23:46.801026 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:46.800945 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:46.801250 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:46.801027 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:46.801250 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:46.801070 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:46.801250 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:46.801187 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:46.801415 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:46.801358 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:46.801471 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:46.801456 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:47.800017 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:47.799980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:47.800570 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:47.800130 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:47.800570 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:47.800201 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret podName:ffc97da9-d997-4214-bf06-cbdb4a551c74 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:03.800183003 +0000 UTC m=+46.605024288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret") pod "global-pull-secret-syncer-kjl96" (UID: "ffc97da9-d997-4214-bf06-cbdb4a551c74") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:23:48.801551 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:48.801504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:48.801551 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:48.801553 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:48.802170 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:48.801649 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xkr7v" podUID="5037e972-6e10-4b21-bde5-a072bf744013" Apr 22 19:23:48.802170 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:48.801707 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjl96" podUID="ffc97da9-d997-4214-bf06-cbdb4a551c74" Apr 22 19:23:48.802170 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:48.801752 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:48.802170 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:48.801847 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:23:50.487411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.487332 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-159.ec2.internal" event="NodeReady" Apr 22 19:23:50.488026 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.487498 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:23:50.538297 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.538266 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nrq92"] Apr 22 19:23:50.542606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.542579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.545033 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.545007 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b42mv"] Apr 22 19:23:50.545770 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.545630 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\"" Apr 22 19:23:50.545770 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.545641 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:23:50.545770 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.545630 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:23:50.547877 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.547858 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.551774 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.551724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:23:50.552869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.552567 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:23:50.552869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.552856 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\"" Apr 22 19:23:50.553029 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.553012 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:23:50.553086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.553043 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrq92"] Apr 22 19:23:50.563753 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.563708 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b42mv"] Apr 22 19:23:50.623145 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623107 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.623327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-tmp-dir\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.623327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st7jm\" (UniqueName: \"kubernetes.io/projected/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-kube-api-access-st7jm\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.623327 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-config-volume\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.623433 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623345 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwxbw\" (UniqueName: \"kubernetes.io/projected/db70a090-7023-443d-b909-09cc5a489c13-kube-api-access-dwxbw\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.623433 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.623384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.724638 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-tmp-dir\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.724638 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st7jm\" (UniqueName: \"kubernetes.io/projected/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-kube-api-access-st7jm\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.724899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-config-volume\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.724899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwxbw\" (UniqueName: \"kubernetes.io/projected/db70a090-7023-443d-b909-09cc5a489c13-kube-api-access-dwxbw\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.724899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.724899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.724829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.725092 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:50.724941 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:50.725092 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:50.725001 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:51.224983057 +0000 UTC m=+34.029824360 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:23:50.725092 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.725064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-tmp-dir\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.725248 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:50.725104 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:50.725248 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:50.725181 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:51.225160955 +0000 UTC m=+34.030002378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:23:50.725525 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.725500 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-config-volume\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.738783 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.738689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st7jm\" (UniqueName: \"kubernetes.io/projected/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-kube-api-access-st7jm\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:50.739263 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.739237 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwxbw\" (UniqueName: \"kubernetes.io/projected/db70a090-7023-443d-b909-09cc5a489c13-kube-api-access-dwxbw\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:50.801349 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.801312 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:50.801542 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.801380 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:50.801542 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.801397 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804616 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l6zl6\"" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\"" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:23:50.805046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:50.804998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:23:51.229482 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.229448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:51.229685 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.229522 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:51.229685 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.229611 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:51.229820 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.229686 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:52.229666092 +0000 UTC m=+35.034507379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:23:51.229820 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.229611 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:51.229820 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.229815 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. 
No retries permitted until 2026-04-22 19:23:52.229790946 +0000 UTC m=+35.034632240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:23:51.431180 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.431142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:23:51.431180 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.431182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:51.431398 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.431289 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:23:51.431398 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:51.431355 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:23.431338584 +0000 UTC m=+66.236179866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : secret "metrics-daemon-secret" not found Apr 22 19:23:51.433682 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.433652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdx4h\" (UniqueName: \"kubernetes.io/projected/5037e972-6e10-4b21-bde5-a072bf744013-kube-api-access-gdx4h\") pod \"network-check-target-xkr7v\" (UID: \"5037e972-6e10-4b21-bde5-a072bf744013\") " pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:51.720341 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.720307 2579 util.go:30] "No sandbox for pod can be found. 
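The recurring "Couldn't get secret ... not found" failures above all follow one pattern: a pod spec references a secret (dns-default-metrics-tls, canary-serving-cert, metrics-daemon-secret) that its owning operator has not created yet, so the kubelet cannot materialize the secret volume and requeues the mount. One way to watch for the secrets from outside the node is a few Get calls with client-go; this is a hypothetical diagnostic sketch, not part of the log, and it assumes a reachable kubeconfig:

// check_secrets.go: poll the secrets the mounts above are waiting on.
// Namespace/name pairs are taken from the "Couldn't get secret" lines.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	waiting := map[string]string{
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-ingress-canary": "canary-serving-cert",
		"openshift-multus":         "metrics-daemon-secret",
	}
	for ns, name := range waiting {
		if _, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{}); err != nil {
			fmt.Printf("%s/%s: still missing (%v)\n", ns, name, err)
			continue
		}
		fmt.Printf("%s/%s: present\n", ns, name)
	}
}

Once the operator creates a secret, the next MountVolume.SetUp retry for that volume succeeds and the pod can start.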
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:51.990985 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:51.990805 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xkr7v"] Apr 22 19:23:51.994893 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:23:51.994866 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5037e972_6e10_4b21_bde5_a072bf744013.slice/crio-7d66093ffeb965166deb4464fb64217d273130c25ae6ea7c1f7c18177af3d5c9 WatchSource:0}: Error finding container 7d66093ffeb965166deb4464fb64217d273130c25ae6ea7c1f7c18177af3d5c9: Status 404 returned error can't find the container with id 7d66093ffeb965166deb4464fb64217d273130c25ae6ea7c1f7c18177af3d5c9 Apr 22 19:23:52.236334 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:52.236227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:52.236334 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:52.236328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:52.236516 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:52.236390 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:52.236516 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:52.236401 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:52.236516 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:52.236464 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:54.236442701 +0000 UTC m=+37.041283988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:23:52.236516 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:52.236478 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:54.236471384 +0000 UTC m=+37.041312666 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:23:52.948107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:52.948063 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="d2358c959f695186d08a0fdd9ba3dc65aeb7fc9156bd7ffd1dd031d0c55116ab" exitCode=0 Apr 22 19:23:52.948676 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:52.948109 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"d2358c959f695186d08a0fdd9ba3dc65aeb7fc9156bd7ffd1dd031d0c55116ab"} Apr 22 19:23:52.949777 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:52.949460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xkr7v" event={"ID":"5037e972-6e10-4b21-bde5-a072bf744013","Type":"ContainerStarted","Data":"7d66093ffeb965166deb4464fb64217d273130c25ae6ea7c1f7c18177af3d5c9"} Apr 22 19:23:53.955376 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:53.955344 2579 generic.go:358] "Generic (PLEG): container finished" podID="1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d" containerID="f942cae7e8a4dfed015b332f9d10b0c3d0b1bf962fde993311d45cd08f7fd88d" exitCode=0 Apr 22 19:23:53.955805 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:53.955411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerDied","Data":"f942cae7e8a4dfed015b332f9d10b0c3d0b1bf962fde993311d45cd08f7fd88d"} Apr 22 19:23:54.252498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:54.252411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:54.252498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:54.252471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:54.252686 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:54.252583 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:54.252686 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:54.252642 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:23:58.252628329 +0000 UTC m=+41.057469610 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:23:54.252686 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:54.252581 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:54.252849 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:54.252752 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:58.252719883 +0000 UTC m=+41.057561170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:23:54.960857 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:54.960639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" event={"ID":"1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d","Type":"ContainerStarted","Data":"28ceae51f32a554ffc80beecba4e2e9b08f51e08aea389793370f6c5fdfb7bd6"} Apr 22 19:23:54.987471 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:54.987423 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hzp7f" podStartSLOduration=6.676301001 podStartE2EDuration="37.987406924s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:20.5303959 +0000 UTC m=+3.335237184" lastFinishedPulling="2026-04-22 19:23:51.841501822 +0000 UTC m=+34.646343107" observedRunningTime="2026-04-22 19:23:54.985935974 +0000 UTC m=+37.790777278" watchObservedRunningTime="2026-04-22 19:23:54.987406924 +0000 UTC m=+37.792248230" Apr 22 19:23:55.350329 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.350303 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57dc5fdf5c-sl6kr"] Apr 22 19:23:55.375258 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.375229 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57dc5fdf5c-sl6kr"] Apr 22 19:23:55.375415 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.375352 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.378969 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.378946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:23:55.379971 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.379723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:23:55.381367 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.380072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:23:55.381367 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.380441 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2jtv7\"" Apr 22 19:23:55.387596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.387576 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:23:55.461978 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.461939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.461978 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.461978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.462003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.462105 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.462166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462281 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:23:55.462213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462281 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.462275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.462343 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.462294 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxz77\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563271 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563239 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxz77\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 
ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563678 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:55.563655 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:55.563678 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:55.563676 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:23:55.563783 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:55.563769 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:23:56.063725018 +0000 UTC m=+38.868566310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:23:55.563832 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.563880 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.563853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.565038 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.565014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.567793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.567770 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.567793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.567781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.572295 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.572264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxz77\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.572804 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.572782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.575706 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.575687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:55.963976 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.963939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xkr7v" event={"ID":"5037e972-6e10-4b21-bde5-a072bf744013","Type":"ContainerStarted","Data":"f733536d00ba67ab1c0226b83a087c9c75700cd6af8acba9f57ddf937b9d5e94"} Apr 22 19:23:55.964543 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.964019 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:23:55.980711 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:55.980664 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xkr7v" podStartSLOduration=35.959814335 podStartE2EDuration="38.980649509s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:23:51.996853937 +0000 UTC m=+34.801695218" lastFinishedPulling="2026-04-22 19:23:55.017689106 +0000 UTC m=+37.822530392" observedRunningTime="2026-04-22 19:23:55.979979025 +0000 UTC m=+38.784820331" watchObservedRunningTime="2026-04-22 19:23:55.980649509 +0000 UTC m=+38.785490859" Apr 22 19:23:56.067536 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:56.067495 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: 
\"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:56.067764 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:56.067690 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:56.067764 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:56.067715 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:23:56.067868 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:56.067806 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:23:57.067783102 +0000 UTC m=+39.872624389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:23:57.077748 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:57.077701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:57.078334 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:57.077835 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:57.078334 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:57.077847 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:23:57.078334 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:57.077934 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:23:59.077918593 +0000 UTC m=+41.882759880 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:23:58.286223 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:58.286180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:23:58.286619 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:58.286350 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:23:58.286619 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:58.286355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:23:58.286619 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:58.286421 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:06.286405865 +0000 UTC m=+49.091247151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:23:58.286619 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:58.286423 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:23:58.286619 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:58.286487 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:06.286471813 +0000 UTC m=+49.091313097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:23:59.091471 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:23:59.091367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:23:59.091615 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:59.091514 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:23:59.091615 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:59.091529 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:23:59.091615 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:23:59.091586 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:24:03.091569695 +0000 UTC m=+45.896411001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:24:03.122416 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:03.122365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:24:03.122867 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:03.122491 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:03.122867 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:03.122510 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:24:03.122867 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:03.122561 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:24:11.12254758 +0000 UTC m=+53.927388861 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:24:03.828349 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:03.828303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:24:03.830755 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:03.830713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ffc97da9-d997-4214-bf06-cbdb4a551c74-original-pull-secret\") pod \"global-pull-secret-syncer-kjl96\" (UID: \"ffc97da9-d997-4214-bf06-cbdb4a551c74\") " pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:24:04.026988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:04.026946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjl96" Apr 22 19:24:04.160517 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:04.160481 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjl96"] Apr 22 19:24:04.983874 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:04.983842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjl96" event={"ID":"ffc97da9-d997-4214-bf06-cbdb4a551c74","Type":"ContainerStarted","Data":"2fce4ac66130407814f471e3e515b578cfcf8c997bb3d85869ddd53022803923"} Apr 22 19:24:06.351258 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:06.351225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:24:06.351780 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:06.351313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:24:06.351780 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:06.351375 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:06.351780 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:06.351436 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:06.351780 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:06.351440 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:22.351425625 +0000 UTC m=+65.156266912 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:24:06.351780 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:06.351508 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:22.351489683 +0000 UTC m=+65.156330972 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:24:08.995559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:08.995522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjl96" event={"ID":"ffc97da9-d997-4214-bf06-cbdb4a551c74","Type":"ContainerStarted","Data":"63530b51bce6c7b62bb05f4e1e335c8955d595df29b6fcf6386996e963bf13b4"} Apr 22 19:24:09.016358 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:09.016303 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kjl96" podStartSLOduration=33.118951648 podStartE2EDuration="37.016287654s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:24:04.165189514 +0000 UTC m=+46.970030796" lastFinishedPulling="2026-04-22 19:24:08.062525514 +0000 UTC m=+50.867366802" observedRunningTime="2026-04-22 19:24:09.016110925 +0000 UTC m=+51.820952241" watchObservedRunningTime="2026-04-22 19:24:09.016287654 +0000 UTC m=+51.821128958" Apr 22 19:24:11.187596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:11.187564 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:24:11.188012 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:11.187714 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:11.188012 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:11.187746 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:24:11.188012 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:11.187809 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:24:27.187792909 +0000 UTC m=+69.992634191 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:24:16.949414 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:16.949384 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2crp2" Apr 22 19:24:22.364535 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:22.364499 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:24:22.365069 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:22.364553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:24:22.365069 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:22.364642 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:22.365069 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:22.364694 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:24:54.364680216 +0000 UTC m=+97.169521498 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:24:22.365178 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:22.365066 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:22.365178 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:22.365119 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:54.365105361 +0000 UTC m=+97.169946643 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:24:23.473003 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:23.472947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:24:23.473389 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:23.473114 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:24:23.473389 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:23.473197 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:27.47318125 +0000 UTC m=+130.278022531 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : secret "metrics-daemon-secret" not found Apr 22 19:24:26.968970 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:26.968931 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xkr7v" Apr 22 19:24:27.201695 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:27.201646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:24:27.201924 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:27.201804 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:27.201924 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:27.201823 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:24:27.201924 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:27.201901 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:24:59.201882111 +0000 UTC m=+102.006723392 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:24:54.393442 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:54.393394 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:24:54.393816 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:54.393468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:24:54.393816 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:54.393547 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:54.393816 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:54.393617 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert podName:db70a090-7023-443d-b909-09cc5a489c13 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:58.393600098 +0000 UTC m=+161.198441380 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert") pod "ingress-canary-b42mv" (UID: "db70a090-7023-443d-b909-09cc5a489c13") : secret "canary-serving-cert" not found Apr 22 19:24:54.393816 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:54.393556 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:54.393816 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:54.393678 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls podName:4cbce8ae-11e1-44fb-a76c-a617c14a01cb nodeName:}" failed. No retries permitted until 2026-04-22 19:25:58.393665662 +0000 UTC m=+161.198506944 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls") pod "dns-default-nrq92" (UID: "4cbce8ae-11e1-44fb-a76c-a617c14a01cb") : secret "dns-default-metrics-tls" not found Apr 22 19:24:59.227849 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:24:59.227812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") pod \"image-registry-57dc5fdf5c-sl6kr\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:24:59.228241 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:59.227928 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:24:59.228241 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:59.227942 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57dc5fdf5c-sl6kr: secret "image-registry-tls" not found Apr 22 19:24:59.228241 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:24:59.227995 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls podName:14a0ecfb-0f63-431b-84a2-7ef9ded759dd nodeName:}" failed. No retries permitted until 2026-04-22 19:26:03.227979935 +0000 UTC m=+166.032821217 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls") pod "image-registry-57dc5fdf5c-sl6kr" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd") : secret "image-registry-tls" not found Apr 22 19:25:13.630035 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.630000 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g5jd8"] Apr 22 19:25:13.632981 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.632962 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.636120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.636094 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pz4cs\"" Apr 22 19:25:13.636120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.636111 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:25:13.636294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.636128 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:25:13.636294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.636095 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:25:13.636294 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.636094 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:25:13.642579 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.642553 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:25:13.643150 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.643131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g5jd8"] Apr 22 19:25:13.733481 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733445 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-tmp\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.733677 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aa8798-c003-4836-b89b-2ec659893918-serving-cert\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.733677 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-snapshots\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.733677 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733647 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjv2p\" (UniqueName: \"kubernetes.io/projected/c6aa8798-c003-4836-b89b-2ec659893918-kube-api-access-jjv2p\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.733872 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.733872 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.733705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-service-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.734363 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.734339 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr"] Apr 22 19:25:13.737119 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.737098 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-55dd5f6bfb-9bmd5"] Apr 22 19:25:13.737237 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.737226 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.739812 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.739794 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.742223 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.742202 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dzxfn\"" Apr 22 19:25:13.742319 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.742212 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:25:13.742319 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.742273 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:25:13.742634 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.742619 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:25:13.742705 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.742673 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:25:13.743844 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743795 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:25:13.743909 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743886 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:25:13.743980 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743910 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lcwqc\"" Apr 22 19:25:13.743980 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743958 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:25:13.744091 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743882 2579 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:25:13.744091 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743813 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:25:13.744091 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.743804 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:25:13.748600 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.748579 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr"] Apr 22 19:25:13.751833 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.751813 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55dd5f6bfb-9bmd5"] Apr 22 19:25:13.835194 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835156 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjv2p\" (UniqueName: \"kubernetes.io/projected/c6aa8798-c003-4836-b89b-2ec659893918-kube-api-access-jjv2p\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835194 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835201 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-stats-auth\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.835447 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835447 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835421 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9pf2\" (UniqueName: \"kubernetes.io/projected/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-kube-api-access-r9pf2\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.835562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-tmp\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btz22\" (UniqueName: \"kubernetes.io/projected/da14fcd3-4263-4cf2-abf8-bdaccee3e441-kube-api-access-btz22\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " 
pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.835562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aa8798-c003-4836-b89b-2ec659893918-serving-cert\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-snapshots\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-default-certificate\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.835709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.835897 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-service-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835897 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.835991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-tmp\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.835991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.835967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.836127 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.836016 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.836218 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.836199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c6aa8798-c003-4836-b89b-2ec659893918-snapshots\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.836292 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.836277 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-service-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.836468 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.836443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6aa8798-c003-4836-b89b-2ec659893918-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.837939 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.837921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aa8798-c003-4836-b89b-2ec659893918-serving-cert\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.843541 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.843512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjv2p\" (UniqueName: \"kubernetes.io/projected/c6aa8798-c003-4836-b89b-2ec659893918-kube-api-access-jjv2p\") pod \"insights-operator-585dfdc468-g5jd8\" (UID: \"c6aa8798-c003-4836-b89b-2ec659893918\") " pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.936719 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-stats-auth\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.936719 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9pf2\" (UniqueName: \"kubernetes.io/projected/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-kube-api-access-r9pf2\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.936719 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:25:13.936713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btz22\" (UniqueName: \"kubernetes.io/projected/da14fcd3-4263-4cf2-abf8-bdaccee3e441-kube-api-access-btz22\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.937002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-default-certificate\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.937002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.937002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.937002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.937002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.936918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.937228 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:13.937005 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:13.937228 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:13.937023 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:13.937228 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:13.937034 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.437012524 +0000 UTC m=+117.241853806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:13.937228 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:13.937078 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.43706294 +0000 UTC m=+117.241904242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found Apr 22 19:25:13.937228 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:13.937096 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.437086969 +0000 UTC m=+117.241928258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:13.937548 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.937529 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:13.939018 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.938991 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-stats-auth\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.939453 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.939437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-default-certificate\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.943373 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.943350 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" Apr 22 19:25:13.948682 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.948658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btz22\" (UniqueName: \"kubernetes.io/projected/da14fcd3-4263-4cf2-abf8-bdaccee3e441-kube-api-access-btz22\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:13.948899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:13.948879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9pf2\" (UniqueName: \"kubernetes.io/projected/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-kube-api-access-r9pf2\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:14.066094 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:14.066061 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-g5jd8"] Apr 22 19:25:14.069332 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:14.069292 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6aa8798_c003_4836_b89b_2ec659893918.slice/crio-b1df80cffa6a88d4fc2796e99c64c819b5a9a46ab4ce9d78cd2a9b7ee1c99840 WatchSource:0}: Error finding container b1df80cffa6a88d4fc2796e99c64c819b5a9a46ab4ce9d78cd2a9b7ee1c99840: Status 404 returned error can't find the container with id b1df80cffa6a88d4fc2796e99c64c819b5a9a46ab4ce9d78cd2a9b7ee1c99840 Apr 22 19:25:14.116943 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:14.116899 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" event={"ID":"c6aa8798-c003-4836-b89b-2ec659893918","Type":"ContainerStarted","Data":"b1df80cffa6a88d4fc2796e99c64c819b5a9a46ab4ce9d78cd2a9b7ee1c99840"} Apr 22 19:25:14.440596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:14.440553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:14.440836 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:14.440607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:14.440836 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:14.440644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:14.440836 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:14.440743 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:15.440707511 +0000 UTC m=+118.245548794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:14.440836 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:14.440809 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:14.441031 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:14.440865 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:15.440849254 +0000 UTC m=+118.245690536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found Apr 22 19:25:14.441031 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:14.440809 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:14.441031 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:14.440920 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:15.440909175 +0000 UTC m=+118.245750462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:15.448896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:15.448859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:15.448896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:15.448906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:15.448927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:15.449023 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:15.449046 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:17.449021077 +0000 UTC m=+120.253862365 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:15.449084 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:17.44906576 +0000 UTC m=+120.253907042 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:15.449029 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:15.449391 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:15.449121 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:17.449113035 +0000 UTC m=+120.253954316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found Apr 22 19:25:16.123074 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:16.123036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" event={"ID":"c6aa8798-c003-4836-b89b-2ec659893918","Type":"ContainerStarted","Data":"677ffafee87eb2cdce13cbc7d1ac9f42fd96d2411a88b6567d47f2b7c403476b"} Apr 22 19:25:16.141886 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:16.141836 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" podStartSLOduration=1.253123141 podStartE2EDuration="3.141819608s" podCreationTimestamp="2026-04-22 19:25:13 +0000 UTC" firstStartedPulling="2026-04-22 19:25:14.071042468 +0000 UTC m=+116.875883750" lastFinishedPulling="2026-04-22 19:25:15.959738922 +0000 UTC m=+118.764580217" observedRunningTime="2026-04-22 19:25:16.140111009 +0000 UTC m=+118.944952313" watchObservedRunningTime="2026-04-22 19:25:16.141819608 +0000 UTC m=+118.946660912" Apr 22 19:25:17.467460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:17.467416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:17.467460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:17.467459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:17.467536 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " 
pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:17.467590 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:17.467622 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:17.467638 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:21.467625245 +0000 UTC m=+124.272466527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:17.467663 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:21.467644771 +0000 UTC m=+124.272486059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:17.467900 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:17.467682 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:21.467673219 +0000 UTC m=+124.272514501 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found Apr 22 19:25:19.206780 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:19.206752 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bh88_73366f6c-dc1e-4c5b-a1f3-e3d7839a351e/dns-node-resolver/0.log" Apr 22 19:25:20.600941 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:20.600909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-drfzw_54216124-3633-4740-9592-06c935cb0781/node-ca/0.log" Apr 22 19:25:21.502007 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:21.501958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:21.502007 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:21.502006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:21.502217 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:21.502081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:21.502217 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:21.502133 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:21.502217 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:21.502157 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:25:21.502217 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:21.502204 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:29.502187301 +0000 UTC m=+132.307028587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt Apr 22 19:25:21.502352 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:21.502221 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:25:29.50221437 +0000 UTC m=+132.307055652 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found Apr 22 19:25:21.502352 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:21.502231 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:29.502225925 +0000 UTC m=+132.307067207 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:25:23.692914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.692873 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"] Apr 22 19:25:23.696037 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.696020 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd" Apr 22 19:25:23.698776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.698747 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:23.698776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.698747 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rbcxg\"" Apr 22 19:25:23.699874 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.699860 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:23.702314 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.702179 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"] Apr 22 19:25:23.705101 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.705082 2579 util.go:30] "No sandbox for pod can be found. 
Apr 22 19:25:23.706082 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.706064 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"]
Apr 22 19:25:23.708810 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.708782 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 19:25:23.708914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.708817 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-qc26f\""
Apr 22 19:25:23.708914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.708836 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 19:25:23.708914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.708797 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 19:25:23.708914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.708887 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:25:23.719491 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.719461 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"]
Apr 22 19:25:23.823892 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.823853 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-config\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.824076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.823910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frglv\" (UniqueName: \"kubernetes.io/projected/0146a304-7dc4-4714-9289-ca9e3f151e55-kube-api-access-frglv\") pod \"volume-data-source-validator-7c6cbb6c87-82bbd\" (UID: \"0146a304-7dc4-4714-9289-ca9e3f151e55\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"
Apr 22 19:25:23.824076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.823950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdxj8\" (UniqueName: \"kubernetes.io/projected/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-kube-api-access-xdxj8\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.824076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.824023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.925064 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.925033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.925248 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.925102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-config\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.925248 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.925135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frglv\" (UniqueName: \"kubernetes.io/projected/0146a304-7dc4-4714-9289-ca9e3f151e55-kube-api-access-frglv\") pod \"volume-data-source-validator-7c6cbb6c87-82bbd\" (UID: \"0146a304-7dc4-4714-9289-ca9e3f151e55\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"
Apr 22 19:25:23.925248 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.925154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdxj8\" (UniqueName: \"kubernetes.io/projected/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-kube-api-access-xdxj8\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.925716 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.925694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-config\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.927261 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.927236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.933125 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.933105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdxj8\" (UniqueName: \"kubernetes.io/projected/e672d5c5-46b6-4d03-88de-c5f1dd4735ad-kube-api-access-xdxj8\") pod \"service-ca-operator-d6fc45fc5-8znjq\" (UID: \"e672d5c5-46b6-4d03-88de-c5f1dd4735ad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:23.933566 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:23.933548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frglv\" (UniqueName: \"kubernetes.io/projected/0146a304-7dc4-4714-9289-ca9e3f151e55-kube-api-access-frglv\") pod \"volume-data-source-validator-7c6cbb6c87-82bbd\" (UID: \"0146a304-7dc4-4714-9289-ca9e3f151e55\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"
Apr 22 19:25:24.004810 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:24.004710 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"
Apr 22 19:25:24.014684 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:24.014655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"
Apr 22 19:25:24.131803 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:24.131767 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd"]
Apr 22 19:25:24.134529 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:24.134482 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0146a304_7dc4_4714_9289_ca9e3f151e55.slice/crio-3c59e0744e9126993996da1e6950c3f24aa8a8f52aa2db02b10d7435ff97af86 WatchSource:0}: Error finding container 3c59e0744e9126993996da1e6950c3f24aa8a8f52aa2db02b10d7435ff97af86: Status 404 returned error can't find the container with id 3c59e0744e9126993996da1e6950c3f24aa8a8f52aa2db02b10d7435ff97af86
Apr 22 19:25:24.140161 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:24.140135 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd" event={"ID":"0146a304-7dc4-4714-9289-ca9e3f151e55","Type":"ContainerStarted","Data":"3c59e0744e9126993996da1e6950c3f24aa8a8f52aa2db02b10d7435ff97af86"}
Apr 22 19:25:24.146350 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:24.146328 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq"]
Apr 22 19:25:24.149146 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:24.149121 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode672d5c5_46b6_4d03_88de_c5f1dd4735ad.slice/crio-96ece5490f5e55d6e4c02cb47bc1b016f656cbf5550e27c98728f815defaf647 WatchSource:0}: Error finding container 96ece5490f5e55d6e4c02cb47bc1b016f656cbf5550e27c98728f815defaf647: Status 404 returned error can't find the container with id 96ece5490f5e55d6e4c02cb47bc1b016f656cbf5550e27c98728f815defaf647
Apr 22 19:25:25.144100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:25.144035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq" event={"ID":"e672d5c5-46b6-4d03-88de-c5f1dd4735ad","Type":"ContainerStarted","Data":"96ece5490f5e55d6e4c02cb47bc1b016f656cbf5550e27c98728f815defaf647"}
Apr 22 19:25:26.147323 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:26.147282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq" event={"ID":"e672d5c5-46b6-4d03-88de-c5f1dd4735ad","Type":"ContainerStarted","Data":"4807b969d879dd0e8102659fcb265bdfea7f066f5273f1fee670be427a891c83"}
Apr 22 19:25:26.148598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:26.148576 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd" event={"ID":"0146a304-7dc4-4714-9289-ca9e3f151e55","Type":"ContainerStarted","Data":"01a24a14a1120cbc05b0b9d9b65add19adc53b501b3b72f5ea4b508cdf6779d8"}
Apr 22 19:25:26.163527 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:26.163480 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq" podStartSLOduration=1.307762028 podStartE2EDuration="3.163469062s" podCreationTimestamp="2026-04-22 19:25:23 +0000 UTC" firstStartedPulling="2026-04-22 19:25:24.150943509 +0000 UTC m=+126.955784794" lastFinishedPulling="2026-04-22 19:25:26.006650546 +0000 UTC m=+128.811491828" observedRunningTime="2026-04-22 19:25:26.16225369 +0000 UTC m=+128.967095019" watchObservedRunningTime="2026-04-22 19:25:26.163469062 +0000 UTC m=+128.968310363"
Apr 22 19:25:26.178928 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:26.178877 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82bbd" podStartSLOduration=1.31416299 podStartE2EDuration="3.178863295s" podCreationTimestamp="2026-04-22 19:25:23 +0000 UTC" firstStartedPulling="2026-04-22 19:25:24.136561314 +0000 UTC m=+126.941402596" lastFinishedPulling="2026-04-22 19:25:26.001261619 +0000 UTC m=+128.806102901" observedRunningTime="2026-04-22 19:25:26.178045225 +0000 UTC m=+128.982886529" watchObservedRunningTime="2026-04-22 19:25:26.178863295 +0000 UTC m=+128.983704599"
Apr 22 19:25:27.555674 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:27.555639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8"
Apr 22 19:25:27.556076 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:27.555807 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:25:27.556076 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:27.555868 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs podName:9c0f6922-8799-4caa-adfb-fa958fee9291 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:29.555853438 +0000 UTC m=+252.360694724 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs") pod "network-metrics-daemon-qwbg8" (UID: "9c0f6922-8799-4caa-adfb-fa958fee9291") : secret "metrics-daemon-secret" not found
Apr 22 19:25:28.761269 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.761231 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"]
Apr 22 19:25:28.764271 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.764252 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"
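The pod_startup_latency_tracker entries above appear to satisfy podStartSLOduration = podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling), i.e. startup time excluding image pulls. A small Go check of that reading, using the monotonic m=+ offsets from the service-ca-operator entry (the relationship is inferred from the numbers, not taken from kubelet source):

package main

import "fmt"

func main() {
	// m=+ offsets and durations copied from the service-ca-operator entry above
	firstStartedPulling := 126.955784794 // m=+ at firstStartedPulling
	lastFinishedPulling := 128.811491828 // m=+ at lastFinishedPulling
	e2e := 3.163469062                   // podStartE2EDuration in seconds

	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("%.9f\n", e2e-pull) // prints 1.307762028, matching podStartSLOduration
}

The volume-data-source-validator entry checks out the same way: 3.178863295 - (128.806102901 - 126.941402596) = 1.31416299.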
Apr 22 19:25:28.766875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.766849 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 19:25:28.767002 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.766899 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 19:25:28.768006 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.767990 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-ntgk2\""
Apr 22 19:25:28.773099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.773074 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"]
Apr 22 19:25:28.865215 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.865181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hx6\" (UniqueName: \"kubernetes.io/projected/a21b65b1-0841-47bb-a561-5ca5bb8578a0-kube-api-access-h4hx6\") pod \"migrator-74bb7799d9-wzvx6\" (UID: \"a21b65b1-0841-47bb-a561-5ca5bb8578a0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"
Apr 22 19:25:28.966018 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.965976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hx6\" (UniqueName: \"kubernetes.io/projected/a21b65b1-0841-47bb-a561-5ca5bb8578a0-kube-api-access-h4hx6\") pod \"migrator-74bb7799d9-wzvx6\" (UID: \"a21b65b1-0841-47bb-a561-5ca5bb8578a0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"
Apr 22 19:25:28.974234 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:28.974198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hx6\" (UniqueName: \"kubernetes.io/projected/a21b65b1-0841-47bb-a561-5ca5bb8578a0-kube-api-access-h4hx6\") pod \"migrator-74bb7799d9-wzvx6\" (UID: \"a21b65b1-0841-47bb-a561-5ca5bb8578a0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"
Apr 22 19:25:29.073400 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:29.073314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"
Apr 22 19:25:29.202748 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:29.202700 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6"]
Apr 22 19:25:29.206595 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:29.206547 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21b65b1_0841_47bb_a561_5ca5bb8578a0.slice/crio-f9c26f54df56620301a1842a66c6c1cff1dd51e614ad43f778cdf3ab7b42c43d WatchSource:0}: Error finding container f9c26f54df56620301a1842a66c6c1cff1dd51e614ad43f778cdf3ab7b42c43d: Status 404 returned error can't find the container with id f9c26f54df56620301a1842a66c6c1cff1dd51e614ad43f778cdf3ab7b42c43d
Apr 22 19:25:29.572400 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:29.572356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5"
Apr 22 19:25:29.572594 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:29.572409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr"
Apr 22 19:25:29.572594 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:29.572443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5"
Apr 22 19:25:29.572594 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:29.572545 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.572523455 +0000 UTC m=+148.377364756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : configmap references non-existent config key: service-ca.crt
Apr 22 19:25:29.572594 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:29.572556 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:25:29.572594 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:29.572595 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs podName:da14fcd3-4263-4cf2-abf8-bdaccee3e441 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.572583872 +0000 UTC m=+148.377425154 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs") pod "router-default-55dd5f6bfb-9bmd5" (UID: "da14fcd3-4263-4cf2-abf8-bdaccee3e441") : secret "router-metrics-certs-default" not found
Apr 22 19:25:29.572847 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:29.572609 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:25:29.572847 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:29.572697 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls podName:a0018d13-5e39-40ec-a8e1-bc62c3aeee0a nodeName:}" failed. No retries permitted until 2026-04-22 19:25:45.572677274 +0000 UTC m=+148.377518572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-z82sr" (UID: "a0018d13-5e39-40ec-a8e1-bc62c3aeee0a") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:25:30.167711 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:30.167307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6" event={"ID":"a21b65b1-0841-47bb-a561-5ca5bb8578a0","Type":"ContainerStarted","Data":"f9c26f54df56620301a1842a66c6c1cff1dd51e614ad43f778cdf3ab7b42c43d"}
Apr 22 19:25:31.171158 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:31.171123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6" event={"ID":"a21b65b1-0841-47bb-a561-5ca5bb8578a0","Type":"ContainerStarted","Data":"6160731b5aec672908b6475896f746667679ccc5a391d80c282fc17b6dea5c93"}
Apr 22 19:25:31.171158 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:31.171162 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6" event={"ID":"a21b65b1-0841-47bb-a561-5ca5bb8578a0","Type":"ContainerStarted","Data":"6b0b96ee899931eccb5b3e5e5b89296266f0d864e4f07de2fa9fa4e0ce7b856b"}
Apr 22 19:25:31.189074 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:31.189021 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wzvx6" podStartSLOduration=1.785196239 podStartE2EDuration="3.189005906s" podCreationTimestamp="2026-04-22 19:25:28 +0000 UTC" firstStartedPulling="2026-04-22 19:25:29.208568907 +0000 UTC m=+132.013410193" lastFinishedPulling="2026-04-22 19:25:30.61237857 +0000 UTC m=+133.417219860" observedRunningTime="2026-04-22 19:25:31.188327926 +0000 UTC m=+133.993169230" watchObservedRunningTime="2026-04-22 19:25:31.189005906 +0000 UTC m=+133.993847213"
Apr 22 19:25:45.608021 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.607943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5"
Apr 22 19:25:45.608021 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.608033 2579 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:45.608448 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.608056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:45.609609 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.609581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da14fcd3-4263-4cf2-abf8-bdaccee3e441-service-ca-bundle\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:45.610382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.610362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da14fcd3-4263-4cf2-abf8-bdaccee3e441-metrics-certs\") pod \"router-default-55dd5f6bfb-9bmd5\" (UID: \"da14fcd3-4263-4cf2-abf8-bdaccee3e441\") " pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:45.610512 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.610491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0018d13-5e39-40ec-a8e1-bc62c3aeee0a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-z82sr\" (UID: \"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:45.850919 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.850888 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-dzxfn\"" Apr 22 19:25:45.855842 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.855822 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lcwqc\"" Apr 22 19:25:45.858970 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.858920 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" Apr 22 19:25:45.863617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:45.863593 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:46.001927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.001888 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr"] Apr 22 19:25:46.005948 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:46.005918 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0018d13_5e39_40ec_a8e1_bc62c3aeee0a.slice/crio-2df39c340340872e569cc28dbb13001409cb890a530e7efdfe6a72de4e3fa60f WatchSource:0}: Error finding container 2df39c340340872e569cc28dbb13001409cb890a530e7efdfe6a72de4e3fa60f: Status 404 returned error can't find the container with id 2df39c340340872e569cc28dbb13001409cb890a530e7efdfe6a72de4e3fa60f Apr 22 19:25:46.022135 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.022113 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55dd5f6bfb-9bmd5"] Apr 22 19:25:46.024581 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:46.024559 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda14fcd3_4263_4cf2_abf8_bdaccee3e441.slice/crio-03df61e5ad1863c5aad5524c631bfbd20e13622ee65fad8402b87788f1fbf6c7 WatchSource:0}: Error finding container 03df61e5ad1863c5aad5524c631bfbd20e13622ee65fad8402b87788f1fbf6c7: Status 404 returned error can't find the container with id 03df61e5ad1863c5aad5524c631bfbd20e13622ee65fad8402b87788f1fbf6c7 Apr 22 19:25:46.208800 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.208697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" event={"ID":"da14fcd3-4263-4cf2-abf8-bdaccee3e441","Type":"ContainerStarted","Data":"ee4eefcff82c2f46e549c05d4ca64c4080bb473c60f2f19dac4e8f6000e15b86"} Apr 22 19:25:46.208800 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.208761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" event={"ID":"da14fcd3-4263-4cf2-abf8-bdaccee3e441","Type":"ContainerStarted","Data":"03df61e5ad1863c5aad5524c631bfbd20e13622ee65fad8402b87788f1fbf6c7"} Apr 22 19:25:46.209898 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.209861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" event={"ID":"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a","Type":"ContainerStarted","Data":"2df39c340340872e569cc28dbb13001409cb890a530e7efdfe6a72de4e3fa60f"} Apr 22 19:25:46.229494 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.229449 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" podStartSLOduration=33.229434489 podStartE2EDuration="33.229434489s" podCreationTimestamp="2026-04-22 19:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:46.228515976 +0000 UTC m=+149.033357281" watchObservedRunningTime="2026-04-22 19:25:46.229434489 +0000 UTC m=+149.034275839" Apr 22 19:25:46.864443 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.864322 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:46.867398 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:46.867369 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:47.213241 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:47.213146 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:47.214584 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:47.214550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-55dd5f6bfb-9bmd5" Apr 22 19:25:48.216679 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:48.216647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" event={"ID":"a0018d13-5e39-40ec-a8e1-bc62c3aeee0a","Type":"ContainerStarted","Data":"c43d1355bd383135ec2223505a18d0b69694117b788721269fd063f5c430b97c"} Apr 22 19:25:48.233151 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:48.233101 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-z82sr" podStartSLOduration=33.402032743 podStartE2EDuration="35.233086231s" podCreationTimestamp="2026-04-22 19:25:13 +0000 UTC" firstStartedPulling="2026-04-22 19:25:46.007701834 +0000 UTC m=+148.812543119" lastFinishedPulling="2026-04-22 19:25:47.838755326 +0000 UTC m=+150.643596607" observedRunningTime="2026-04-22 19:25:48.232279062 +0000 UTC m=+151.037120365" watchObservedRunningTime="2026-04-22 19:25:48.233086231 +0000 UTC m=+151.037927534" Apr 22 19:25:52.471786 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.471746 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x8ngq"] Apr 22 19:25:52.475037 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.475021 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.477762 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.477718 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-np8h6\"" Apr 22 19:25:52.479086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.479072 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:25:52.479152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.479094 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:25:52.488608 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.488578 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x8ngq"] Apr 22 19:25:52.537109 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.537075 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57dc5fdf5c-sl6kr"] Apr 22 19:25:52.537303 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:52.537283 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" podUID="14a0ecfb-0f63-431b-84a2-7ef9ded759dd" Apr 22 19:25:52.568290 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.568233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-data-volume\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.568290 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.568286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.568523 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.568358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-678b6\" (UniqueName: \"kubernetes.io/projected/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-api-access-678b6\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.568523 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.568437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-crio-socket\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.568620 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.568514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669088 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-data-volume\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669088 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669359 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-678b6\" (UniqueName: \"kubernetes.io/projected/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-api-access-678b6\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669359 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-crio-socket\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669359 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669517 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-crio-socket\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669773 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-data-volume\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.669885 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.669869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x8ngq\" (UID: 
\"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.671616 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.671595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.680633 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.680608 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c9898b8d8-4dftw"] Apr 22 19:25:52.683559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.683544 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.690166 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.690144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-678b6\" (UniqueName: \"kubernetes.io/projected/ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef-kube-api-access-678b6\") pod \"insights-runtime-extractor-x8ngq\" (UID: \"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef\") " pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.702109 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.702082 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c9898b8d8-4dftw"] Apr 22 19:25:52.770705 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770604 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zd9l\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-kube-api-access-7zd9l\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.770896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-certificates\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.770896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-trusted-ca\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.770896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-bound-sa-token\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.770896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770867 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-installation-pull-secrets\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.770896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-tls\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.771061 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.770912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-ca-trust-extracted\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.771061 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.771003 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-image-registry-private-configuration\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.784156 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.784131 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x8ngq" Apr 22 19:25:52.871363 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-trusted-ca\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-bound-sa-token\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-installation-pull-secrets\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-tls\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-ca-trust-extracted\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-image-registry-private-configuration\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871829 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zd9l\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-kube-api-access-7zd9l\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.871829 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.871612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-certificates\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " 
pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.872402 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.872038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-ca-trust-extracted\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.872545 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.872476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-certificates\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.873585 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.872706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-trusted-ca\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.875093 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.875024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-image-registry-private-configuration\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.875197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.875098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-registry-tls\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.875508 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.875473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-installation-pull-secrets\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.883366 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.883344 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-bound-sa-token\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.883577 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:52.883552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zd9l\" (UniqueName: \"kubernetes.io/projected/463b4c94-dbcc-4e22-8c34-c3a19a07bc68-kube-api-access-7zd9l\") pod \"image-registry-c9898b8d8-4dftw\" (UID: \"463b4c94-dbcc-4e22-8c34-c3a19a07bc68\") " pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:52.905121 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:25:52.905087 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x8ngq"] Apr 22 19:25:52.908942 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:52.908912 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebddd58c_975e_4d0e_ab4c_b9a7564ac2ef.slice/crio-a255635d46e0035cc839d384b788b706668467373a3347bad6389556534df214 WatchSource:0}: Error finding container a255635d46e0035cc839d384b788b706668467373a3347bad6389556534df214: Status 404 returned error can't find the container with id a255635d46e0035cc839d384b788b706668467373a3347bad6389556534df214 Apr 22 19:25:53.002967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.002934 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2jtv7\"" Apr 22 19:25:53.011337 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.011312 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:53.136910 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.136875 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c9898b8d8-4dftw"] Apr 22 19:25:53.140956 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:53.140920 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463b4c94_dbcc_4e22_8c34_c3a19a07bc68.slice/crio-9e83f548a5b379cd58becddc69594d17f11f36ae7850cd29bb88b291792d3958 WatchSource:0}: Error finding container 9e83f548a5b379cd58becddc69594d17f11f36ae7850cd29bb88b291792d3958: Status 404 returned error can't find the container with id 9e83f548a5b379cd58becddc69594d17f11f36ae7850cd29bb88b291792d3958 Apr 22 19:25:53.232587 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.232547 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x8ngq" event={"ID":"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef","Type":"ContainerStarted","Data":"aac632b5a6e4dc55303498acde9e54fedb56871de274a7a30685d0c6944fe14c"} Apr 22 19:25:53.232587 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.232589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x8ngq" event={"ID":"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef","Type":"ContainerStarted","Data":"a255635d46e0035cc839d384b788b706668467373a3347bad6389556534df214"} Apr 22 19:25:53.233942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.233919 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:25:53.233942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.233930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" event={"ID":"463b4c94-dbcc-4e22-8c34-c3a19a07bc68","Type":"ContainerStarted","Data":"9402458fba03ba87d16fdcd8dd9c44058141c5c7f4498b838f6c439f9797b05c"} Apr 22 19:25:53.234099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.233957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" event={"ID":"463b4c94-dbcc-4e22-8c34-c3a19a07bc68","Type":"ContainerStarted","Data":"9e83f548a5b379cd58becddc69594d17f11f36ae7850cd29bb88b291792d3958"} Apr 22 19:25:53.234099 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.234020 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:25:53.238214 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.238195 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:25:53.254240 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.254194 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" podStartSLOduration=1.254177397 podStartE2EDuration="1.254177397s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:53.253590785 +0000 UTC m=+156.058432089" watchObservedRunningTime="2026-04-22 19:25:53.254177397 +0000 UTC m=+156.059018702" Apr 22 19:25:53.376029 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.375996 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376039 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376056 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376094 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376125 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376154 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376184 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxz77\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77\") pod \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\" (UID: \"14a0ecfb-0f63-431b-84a2-7ef9ded759dd\") " Apr 22 19:25:53.376509 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376464 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:53.376509 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.376472 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:25:53.377121 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.377039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:25:53.377121 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.377049 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-trusted-ca\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.377121 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.377096 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-certificates\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.378931 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.378898 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:53.379545 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.379517 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77" (OuterVolumeSpecName: "kube-api-access-cxz77") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "kube-api-access-cxz77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:25:53.379545 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.379536 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:53.379814 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.379783 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "14a0ecfb-0f63-431b-84a2-7ef9ded759dd" (UID: "14a0ecfb-0f63-431b-84a2-7ef9ded759dd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:25:53.478473 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.478425 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxz77\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-kube-api-access-cxz77\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.478473 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.478475 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-image-registry-private-configuration\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.478998 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.478491 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-bound-sa-token\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.478998 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.478506 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-installation-pull-secrets\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.478998 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:53.478521 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-ca-trust-extracted\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:53.557752 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:53.557694 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nrq92" podUID="4cbce8ae-11e1-44fb-a76c-a617c14a01cb" Apr 22 19:25:53.563547 
ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:53.563509 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b42mv" podUID="db70a090-7023-443d-b909-09cc5a489c13" Apr 22 19:25:53.814281 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:25:53.814248 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qwbg8" podUID="9c0f6922-8799-4caa-adfb-fa958fee9291" Apr 22 19:25:54.239060 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.239017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x8ngq" event={"ID":"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef","Type":"ContainerStarted","Data":"6399ae2e0991a20414795b40a58d1d2f07694a3ee24e5716a0f1dae9550317f3"} Apr 22 19:25:54.239238 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.239111 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dc5fdf5c-sl6kr" Apr 22 19:25:54.239238 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.239114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrq92" Apr 22 19:25:54.239238 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.239112 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:25:54.274014 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.273978 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57dc5fdf5c-sl6kr"] Apr 22 19:25:54.280092 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.280060 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57dc5fdf5c-sl6kr"] Apr 22 19:25:54.389070 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:54.389035 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14a0ecfb-0f63-431b-84a2-7ef9ded759dd-registry-tls\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:25:55.243430 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:55.243343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x8ngq" event={"ID":"ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef","Type":"ContainerStarted","Data":"b98731f2589c5e1018a67063ae8f535eedc92d0608cf42f8b862bcc58a235693"} Apr 22 19:25:55.268597 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:55.268538 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x8ngq" podStartSLOduration=1.313590742 podStartE2EDuration="3.268520651s" podCreationTimestamp="2026-04-22 19:25:52 +0000 UTC" firstStartedPulling="2026-04-22 19:25:52.96445788 +0000 UTC m=+155.769299161" lastFinishedPulling="2026-04-22 19:25:54.919387785 +0000 UTC m=+157.724229070" observedRunningTime="2026-04-22 19:25:55.266844984 +0000 UTC m=+158.071686288" watchObservedRunningTime="2026-04-22 19:25:55.268520651 +0000 UTC m=+158.073361955" Apr 22 19:25:55.805020 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:55.804983 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="14a0ecfb-0f63-431b-84a2-7ef9ded759dd" path="/var/lib/kubelet/pods/14a0ecfb-0f63-431b-84a2-7ef9ded759dd/volumes" Apr 22 19:25:58.421712 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.421673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:25:58.422195 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.421764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:25:58.424127 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.424106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cbce8ae-11e1-44fb-a76c-a617c14a01cb-metrics-tls\") pod \"dns-default-nrq92\" (UID: \"4cbce8ae-11e1-44fb-a76c-a617c14a01cb\") " pod="openshift-dns/dns-default-nrq92" Apr 22 19:25:58.424213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.424194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db70a090-7023-443d-b909-09cc5a489c13-cert\") pod \"ingress-canary-b42mv\" (UID: \"db70a090-7023-443d-b909-09cc5a489c13\") " pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:25:58.443184 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.443131 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8fgp\"" Apr 22 19:25:58.443961 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.443946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f8zlb\"" Apr 22 19:25:58.450155 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.450133 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrq92" Apr 22 19:25:58.450264 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.450134 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b42mv" Apr 22 19:25:58.584769 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.584718 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b42mv"] Apr 22 19:25:58.587764 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:58.587717 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb70a090_7023_443d_b909_09cc5a489c13.slice/crio-36a0c934157fbe61676b89425b9e051db8438651935f63366f9e4b735b2a753c WatchSource:0}: Error finding container 36a0c934157fbe61676b89425b9e051db8438651935f63366f9e4b735b2a753c: Status 404 returned error can't find the container with id 36a0c934157fbe61676b89425b9e051db8438651935f63366f9e4b735b2a753c Apr 22 19:25:58.598509 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:58.598484 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrq92"] Apr 22 19:25:58.601333 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:25:58.601296 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbce8ae_11e1_44fb_a76c_a617c14a01cb.slice/crio-d563119a72a43e5c5e52be0fffa621b5025db0de3622a9135f89c6d5417f8c7e WatchSource:0}: Error finding container d563119a72a43e5c5e52be0fffa621b5025db0de3622a9135f89c6d5417f8c7e: Status 404 returned error can't find the container with id d563119a72a43e5c5e52be0fffa621b5025db0de3622a9135f89c6d5417f8c7e Apr 22 19:25:59.254176 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.254134 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrq92" event={"ID":"4cbce8ae-11e1-44fb-a76c-a617c14a01cb","Type":"ContainerStarted","Data":"d563119a72a43e5c5e52be0fffa621b5025db0de3622a9135f89c6d5417f8c7e"} Apr 22 19:25:59.255376 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.255335 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b42mv" event={"ID":"db70a090-7023-443d-b909-09cc5a489c13","Type":"ContainerStarted","Data":"36a0c934157fbe61676b89425b9e051db8438651935f63366f9e4b735b2a753c"} Apr 22 19:25:59.779335 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.779300 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:25:59.782767 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.782742 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.785598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.785576 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:25:59.786931 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.786847 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:25:59.786931 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.786873 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:25:59.787148 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.786943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:25:59.787148 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.787016 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:25:59.787148 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.787038 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qlpqn\"" Apr 22 19:25:59.787148 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.787048 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:25:59.787364 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.787194 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:25:59.794226 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.794203 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:25:59.806670 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.806641 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf"] Apr 22 19:25:59.810006 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.809980 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:25:59.812991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.812966 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:25:59.813118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.813014 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xn8qw\"" Apr 22 19:25:59.813118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.813046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:25:59.813299 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.813160 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 19:25:59.822145 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.822118 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf"] Apr 22 19:25:59.829702 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.829672 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srrrb"] Apr 22 19:25:59.833191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.833168 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.837657 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.837631 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:25:59.838197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.838175 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-mcc6k\"" Apr 22 19:25:59.838329 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.838260 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:25:59.838575 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.838470 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:25:59.863370 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.863337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srrrb"] Apr 22 19:25:59.864752 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.864692 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pq9pg"] Apr 22 19:25:59.868646 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.868616 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:25:59.872136 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.871757 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:25:59.872136 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.872029 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:25:59.872136 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.872135 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:25:59.872388 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.872172 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nfpmc\"" Apr 22 19:25:59.933869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:25:59.933869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933873 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.934078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933914 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6bnh\" (UniqueName: \"kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933962 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:25:59.934078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.933993 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934245 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/59ed8a36-517d-40fc-b340-cfe4a80582da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.934245 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d87v\" (UniqueName: \"kubernetes.io/projected/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-kube-api-access-8d87v\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:25:59.934317 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.934317 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934280 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.934396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934438 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:25:59.934494 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:25:59.934454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:25:59.934541 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:25:59.934607 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:25:59.934574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b76hd\" (UniqueName: \"kubernetes.io/projected/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-api-access-b76hd\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.035946 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.035857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.035946 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.035903 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.035946 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.035936 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-tls\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.035962 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-wtmp\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 
19:26:00.036137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-sys\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b76hd\" (UniqueName: \"kubernetes.io/projected/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-api-access-b76hd\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.036225 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-accelerators-collector-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036522 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.036522 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.036522 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036283 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-textfile\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036522 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.036522 
ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-root\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.036779 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.036353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-metrics-client-ca\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.037476 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.037392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.037611 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.037555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.037611 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.037578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038019 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.037996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038019 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038189 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6bnh\" (UniqueName: \"kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038189 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.038290 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038290 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.038395 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/59ed8a36-517d-40fc-b340-cfe4a80582da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.038395 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d87v\" (UniqueName: \"kubernetes.io/projected/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-kube-api-access-8d87v\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.038395 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.038537 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.038537 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038474 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.038659 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.038565 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-8bxtb\" (UniqueName: \"kubernetes.io/projected/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-kube-api-access-8bxtb\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.039111 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:00.038430 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 19:26:00.039422 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:00.039404 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls podName:a85b69a4-76cb-4a0b-aea8-8638b5db8f71 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:00.53937154 +0000 UTC m=+163.344212822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-mqnsf" (UID: "a85b69a4-76cb-4a0b-aea8-8638b5db8f71") : secret "openshift-state-metrics-tls" not found Apr 22 19:26:00.039517 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.039470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.040067 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.040039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/59ed8a36-517d-40fc-b340-cfe4a80582da-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.041455 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.041426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.042063 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.042036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.043825 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.043802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" 
Apr 22 19:26:00.045980 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.045934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.046605 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.046572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.049790 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.049768 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6bnh\" (UniqueName: \"kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh\") pod \"console-7bc8bbd8b5-x7lg8\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.049943 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.049914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d87v\" (UniqueName: \"kubernetes.io/projected/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-kube-api-access-8d87v\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.050156 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.050138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b76hd\" (UniqueName: \"kubernetes.io/projected/59ed8a36-517d-40fc-b340-cfe4a80582da-kube-api-access-b76hd\") pod \"kube-state-metrics-69db897b98-srrrb\" (UID: \"59ed8a36-517d-40fc-b340-cfe4a80582da\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.094646 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.094606 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:00.140117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140316 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxtb\" (UniqueName: \"kubernetes.io/projected/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-kube-api-access-8bxtb\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140316 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-tls\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140316 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-wtmp\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-sys\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-accelerators-collector-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-textfile\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-sys\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140524 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-root\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-metrics-client-ca\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-root\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140823 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-wtmp\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.140823 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.140690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-textfile\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.141214 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.141191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-metrics-client-ca\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.141311 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.141262 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-accelerators-collector-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.142793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.142767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.143034 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.143008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-node-exporter-tls\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.144934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.144916 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" Apr 22 19:26:00.152241 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.152223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxtb\" (UniqueName: \"kubernetes.io/projected/9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad-kube-api-access-8bxtb\") pod \"node-exporter-pq9pg\" (UID: \"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad\") " pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.180306 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.180270 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pq9pg" Apr 22 19:26:00.469267 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:26:00.469226 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa2bf52_4c93_4fd4_9f9b_78fa40aa6fad.slice/crio-0e236f8518d60353db5d83e903cb8fd4579db436cc0fb95b3f921fd8af8db81c WatchSource:0}: Error finding container 0e236f8518d60353db5d83e903cb8fd4579db436cc0fb95b3f921fd8af8db81c: Status 404 returned error can't find the container with id 0e236f8518d60353db5d83e903cb8fd4579db436cc0fb95b3f921fd8af8db81c Apr 22 19:26:00.544245 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.544176 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.547398 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.547359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a85b69a4-76cb-4a0b-aea8-8638b5db8f71-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mqnsf\" (UID: \"a85b69a4-76cb-4a0b-aea8-8638b5db8f71\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.613665 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.613642 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-srrrb"] Apr 22 19:26:00.619895 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:26:00.619860 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ed8a36_517d_40fc_b340_cfe4a80582da.slice/crio-e65f2033b5e9461917c25ea0adb53ee97ef128a628e95d2013b47dbbb13e63e3 WatchSource:0}: Error finding container e65f2033b5e9461917c25ea0adb53ee97ef128a628e95d2013b47dbbb13e63e3: Status 404 returned error can't find the container with id e65f2033b5e9461917c25ea0adb53ee97ef128a628e95d2013b47dbbb13e63e3 Apr 22 19:26:00.655945 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.655878 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:26:00.659747 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:26:00.659702 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f56637_3af9_4406_911b_fb8449f565c5.slice/crio-aaa02d16db3f1ddb3224eae51a4168112341001a0e23966990f7725b4081d441 WatchSource:0}: Error finding container 
aaa02d16db3f1ddb3224eae51a4168112341001a0e23966990f7725b4081d441: Status 404 returned error can't find the container with id aaa02d16db3f1ddb3224eae51a4168112341001a0e23966990f7725b4081d441 Apr 22 19:26:00.720832 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.720706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" Apr 22 19:26:00.858383 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.858342 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf"] Apr 22 19:26:00.862874 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:26:00.862831 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85b69a4_76cb_4a0b_aea8_8638b5db8f71.slice/crio-47ebc8983f044594610857bbc015c57fb37a28f24cf149ac1a26009cf7d56b35 WatchSource:0}: Error finding container 47ebc8983f044594610857bbc015c57fb37a28f24cf149ac1a26009cf7d56b35: Status 404 returned error can't find the container with id 47ebc8983f044594610857bbc015c57fb37a28f24cf149ac1a26009cf7d56b35 Apr 22 19:26:00.932401 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.932374 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:26:00.937073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.937049 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:00.941756 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.941696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:26:00.941918 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.941785 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cfdjc\"" Apr 22 19:26:00.942920 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.942899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:26:00.943090 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:26:00.943173 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943160 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:26:00.943244 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943178 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:26:00.943287 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943278 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:26:00.943372 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943358 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:26:00.943372 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943368 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:26:00.943778 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.943647 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:26:00.960692 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:00.960642 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:26:01.048986 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.048947 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.048986 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.048980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049050 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049089 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:26:01.049230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049266 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzpn\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.049460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.049387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.150962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.150914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.151128 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.150970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.151128 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151020 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151128 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151128 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151361 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151361 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151441 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151441 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzpn\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.151551 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.151459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.153986 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:01.152150 2579 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 19:26:01.153986 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:01.152225 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls podName:88d1700f-d459-4331-bde4-ac06f9d8e522 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:01.652203717 +0000 UTC m=+164.457045004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522") : secret "alertmanager-main-tls" not found
Apr 22 19:26:01.153986 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.153077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.153986 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.153225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.153986 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:01.153356 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle podName:88d1700f-d459-4331-bde4-ac06f9d8e522 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:01.653338882 +0000 UTC m=+164.458180169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522") : configmap references non-existent config key: ca-bundle.crt
Apr 22 19:26:01.156937 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.156872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.157074 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.157031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.158140 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.158117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.158925 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.158882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.160126 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.159619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.160126 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.160070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.160126 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.160093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:26:01.163444 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.163398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzpn\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0"
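The two SetUp failures just above are ordering races rather than real faults: at first reconcile the secret alertmanager-main-tls does not exist yet and the trusted-CA configmap has not had its ca-bundle.crt key injected, so nestedpendingoperations parks each operation for 500ms (durationBeforeRetry) and the retries at 19:26:01.657 below succeed. A minimal client-go sketch (a hypothetical standalone checker, not kubelet code) that performs the same lookup the kubelet's secret volume plugin makes:

package main

import (
	"context"
	"fmt"
	"log"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig location is an assumption; any reachable config works.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Same GET the kubelet issues before materializing the secret volume.
	_, err = cs.CoreV1().Secrets("openshift-monitoring").
		Get(context.TODO(), "alertmanager-main-tls", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("secret not created yet; kubelet mount will keep retrying")
	case err != nil:
		log.Fatal(err)
	default:
		fmt.Println("secret exists; the next mount retry should succeed")
	}
}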
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.164342 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.164316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.267270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.267155 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrq92" event={"ID":"4cbce8ae-11e1-44fb-a76c-a617c14a01cb","Type":"ContainerStarted","Data":"aa2d2ec6532ede6c41fe317fac251b6d123918c64e1e0b31333bda2dae89dda1"} Apr 22 19:26:01.267270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.267197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrq92" event={"ID":"4cbce8ae-11e1-44fb-a76c-a617c14a01cb","Type":"ContainerStarted","Data":"4790098f667e3c40fef323b1ba24d84c7ecefe6ac2274178c4df16384e188f04"} Apr 22 19:26:01.268453 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.268251 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nrq92" Apr 22 19:26:01.270821 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.270179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b42mv" event={"ID":"db70a090-7023-443d-b909-09cc5a489c13","Type":"ContainerStarted","Data":"8a9afdd284fca3b62f0c375ed3da98ef783d62935a9061f85b398ff01703f2e1"} Apr 22 19:26:01.273362 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.273270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" event={"ID":"a85b69a4-76cb-4a0b-aea8-8638b5db8f71","Type":"ContainerStarted","Data":"65c3903dcc72fdeaedcfe4680297ebeb991cd4a94cdf29e392297a81e6f8e345"} Apr 22 19:26:01.273362 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.273302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" event={"ID":"a85b69a4-76cb-4a0b-aea8-8638b5db8f71","Type":"ContainerStarted","Data":"11e2d4c1cd3c0e2b2f05c4ae2a629e35a6d71daaff037914bc1071f7151739c5"} Apr 22 19:26:01.273362 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.273315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" event={"ID":"a85b69a4-76cb-4a0b-aea8-8638b5db8f71","Type":"ContainerStarted","Data":"47ebc8983f044594610857bbc015c57fb37a28f24cf149ac1a26009cf7d56b35"} Apr 22 19:26:01.276930 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.276876 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc8bbd8b5-x7lg8" event={"ID":"c4f56637-3af9-4406-911b-fb8449f565c5","Type":"ContainerStarted","Data":"aaa02d16db3f1ddb3224eae51a4168112341001a0e23966990f7725b4081d441"} Apr 22 19:26:01.278449 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.278422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" event={"ID":"59ed8a36-517d-40fc-b340-cfe4a80582da","Type":"ContainerStarted","Data":"e65f2033b5e9461917c25ea0adb53ee97ef128a628e95d2013b47dbbb13e63e3"} Apr 22 19:26:01.279872 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.279827 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-pq9pg" event={"ID":"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad","Type":"ContainerStarted","Data":"0e236f8518d60353db5d83e903cb8fd4579db436cc0fb95b3f921fd8af8db81c"} Apr 22 19:26:01.296341 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.296281 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nrq92" podStartSLOduration=129.431277903 podStartE2EDuration="2m11.296257524s" podCreationTimestamp="2026-04-22 19:23:50 +0000 UTC" firstStartedPulling="2026-04-22 19:25:58.603195279 +0000 UTC m=+161.408036560" lastFinishedPulling="2026-04-22 19:26:00.468174883 +0000 UTC m=+163.273016181" observedRunningTime="2026-04-22 19:26:01.293196074 +0000 UTC m=+164.098037409" watchObservedRunningTime="2026-04-22 19:26:01.296257524 +0000 UTC m=+164.101098829" Apr 22 19:26:01.311513 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.311151 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b42mv" podStartSLOduration=129.426362622 podStartE2EDuration="2m11.311129731s" podCreationTimestamp="2026-04-22 19:23:50 +0000 UTC" firstStartedPulling="2026-04-22 19:25:58.589627473 +0000 UTC m=+161.394468754" lastFinishedPulling="2026-04-22 19:26:00.474394567 +0000 UTC m=+163.279235863" observedRunningTime="2026-04-22 19:26:01.309707778 +0000 UTC m=+164.114549092" watchObservedRunningTime="2026-04-22 19:26:01.311129731 +0000 UTC m=+164.115971037" Apr 22 19:26:01.657758 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.657170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.657758 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.657285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.658868 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.658809 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.661851 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.661796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:01.852120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:01.851581 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:26:02.284493 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:02.284402 2579 generic.go:358] "Generic (PLEG): container finished" podID="9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad" containerID="01c87bb5a58190392c38814fa5e8138f41d1b1d14ea9e719367e23dc3ace8fe6" exitCode=0 Apr 22 19:26:02.285023 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:02.284545 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pq9pg" event={"ID":"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad","Type":"ContainerDied","Data":"01c87bb5a58190392c38814fa5e8138f41d1b1d14ea9e719367e23dc3ace8fe6"} Apr 22 19:26:02.486425 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:02.486229 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:26:02.490796 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:26:02.490717 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d1700f_d459_4331_bde4_ac06f9d8e522.slice/crio-4aa53cbe4b3e180e679998852f396b59b4bba685b2739ff41aa50298e2345d8b WatchSource:0}: Error finding container 4aa53cbe4b3e180e679998852f396b59b4bba685b2739ff41aa50298e2345d8b: Status 404 returned error can't find the container with id 4aa53cbe4b3e180e679998852f396b59b4bba685b2739ff41aa50298e2345d8b Apr 22 19:26:03.290384 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.290349 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pq9pg" event={"ID":"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad","Type":"ContainerStarted","Data":"1dabaaa1d73ddc5089c806dab9b20f19f39958507a41ec33497e09041420333b"} Apr 22 19:26:03.290384 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.290390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pq9pg" event={"ID":"9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad","Type":"ContainerStarted","Data":"220db4bb852f641ac494ea0fa80a785c9a992f36e116cee6d5c67a058b98b193"} Apr 22 19:26:03.291744 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.291704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"4aa53cbe4b3e180e679998852f396b59b4bba685b2739ff41aa50298e2345d8b"} Apr 22 19:26:03.293776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.293739 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" event={"ID":"a85b69a4-76cb-4a0b-aea8-8638b5db8f71","Type":"ContainerStarted","Data":"cfb40588b2200ccd1d63ec46af01ab4ffcf2f77c8c082f7125591771010aad6a"} Apr 22 19:26:03.295848 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.295817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" event={"ID":"59ed8a36-517d-40fc-b340-cfe4a80582da","Type":"ContainerStarted","Data":"c0f7e7edff6f06d5f84d6484ac6ae1c5d303c91cf3bb8370691edfb929a05b39"} Apr 22 19:26:03.295956 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.295857 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" event={"ID":"59ed8a36-517d-40fc-b340-cfe4a80582da","Type":"ContainerStarted","Data":"fb3bc18311f4dbcc3a7c61fab2de3a6cc6068a789cbdb46a6533080aa8692fad"} Apr 22 19:26:03.295956 ip-10-0-133-159 kubenswrapper[2579]: 
I0422 19:26:03.295874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" event={"ID":"59ed8a36-517d-40fc-b340-cfe4a80582da","Type":"ContainerStarted","Data":"d8206a15f1f88ece4bcae2b3f75720837ac9778263c4493569b491860400054c"} Apr 22 19:26:03.312824 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.312757 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pq9pg" podStartSLOduration=3.526105063 podStartE2EDuration="4.312722557s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:00.472990448 +0000 UTC m=+163.277831732" lastFinishedPulling="2026-04-22 19:26:01.25960794 +0000 UTC m=+164.064449226" observedRunningTime="2026-04-22 19:26:03.31045336 +0000 UTC m=+166.115294700" watchObservedRunningTime="2026-04-22 19:26:03.312722557 +0000 UTC m=+166.117563861" Apr 22 19:26:03.328276 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.328217 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mqnsf" podStartSLOduration=3.016278986 podStartE2EDuration="4.328196892s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:01.009259379 +0000 UTC m=+163.814100664" lastFinishedPulling="2026-04-22 19:26:02.321177281 +0000 UTC m=+165.126018570" observedRunningTime="2026-04-22 19:26:03.327934613 +0000 UTC m=+166.132775930" watchObservedRunningTime="2026-04-22 19:26:03.328196892 +0000 UTC m=+166.133038195" Apr 22 19:26:03.347297 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:03.346902 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-srrrb" podStartSLOduration=2.650373735 podStartE2EDuration="4.3468792s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:00.622579093 +0000 UTC m=+163.427420376" lastFinishedPulling="2026-04-22 19:26:02.319084556 +0000 UTC m=+165.123925841" observedRunningTime="2026-04-22 19:26:03.345425622 +0000 UTC m=+166.150266920" watchObservedRunningTime="2026-04-22 19:26:03.3468792 +0000 UTC m=+166.151720507" Apr 22 19:26:04.303916 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:04.303876 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc8bbd8b5-x7lg8" event={"ID":"c4f56637-3af9-4406-911b-fb8449f565c5","Type":"ContainerStarted","Data":"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc"} Apr 22 19:26:04.342867 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:04.342815 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bc8bbd8b5-x7lg8" podStartSLOduration=2.221377423 podStartE2EDuration="5.34279832s" podCreationTimestamp="2026-04-22 19:25:59 +0000 UTC" firstStartedPulling="2026-04-22 19:26:00.662608708 +0000 UTC m=+163.467449992" lastFinishedPulling="2026-04-22 19:26:03.784029395 +0000 UTC m=+166.588870889" observedRunningTime="2026-04-22 19:26:04.340865796 +0000 UTC m=+167.145707103" watchObservedRunningTime="2026-04-22 19:26:04.34279832 +0000 UTC m=+167.147639623" Apr 22 19:26:04.800772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:04.800696 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:26:05.307376 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:05.307340 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d" exitCode=0 Apr 22 19:26:05.307837 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:05.307438 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d"} Apr 22 19:26:08.319620 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:08.319580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc"} Apr 22 19:26:08.320100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:08.319626 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34"} Apr 22 19:26:08.320100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:08.319644 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523"} Apr 22 19:26:08.320100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:08.319660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472"} Apr 22 19:26:08.320100 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:08.319674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a"} Apr 22 19:26:09.325080 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:09.325045 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerStarted","Data":"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583"} Apr 22 19:26:09.362084 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:09.362034 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.442441058 podStartE2EDuration="9.362020718s" podCreationTimestamp="2026-04-22 19:26:00 +0000 UTC" firstStartedPulling="2026-04-22 19:26:02.493128471 +0000 UTC m=+165.297969753" lastFinishedPulling="2026-04-22 19:26:08.412708128 +0000 UTC m=+171.217549413" observedRunningTime="2026-04-22 19:26:09.360661873 +0000 UTC m=+172.165503190" watchObservedRunningTime="2026-04-22 19:26:09.362020718 +0000 UTC m=+172.166862022" Apr 22 19:26:10.095546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:10.095509 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 
22 19:26:10.095546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:10.095556 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:10.100429 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:10.100408 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:10.332169 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:10.332142 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:11.286670 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:11.286638 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nrq92" Apr 22 19:26:14.243638 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:14.243607 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c9898b8d8-4dftw" Apr 22 19:26:20.401568 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:20.401521 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:26:37.408038 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:37.408003 2579 generic.go:358] "Generic (PLEG): container finished" podID="c6aa8798-c003-4836-b89b-2ec659893918" containerID="677ffafee87eb2cdce13cbc7d1ac9f42fd96d2411a88b6567d47f2b7c403476b" exitCode=0 Apr 22 19:26:37.408442 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:37.408081 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" event={"ID":"c6aa8798-c003-4836-b89b-2ec659893918","Type":"ContainerDied","Data":"677ffafee87eb2cdce13cbc7d1ac9f42fd96d2411a88b6567d47f2b7c403476b"} Apr 22 19:26:37.408442 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:37.408401 2579 scope.go:117] "RemoveContainer" containerID="677ffafee87eb2cdce13cbc7d1ac9f42fd96d2411a88b6567d47f2b7c403476b" Apr 22 19:26:38.412043 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:38.412003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-g5jd8" event={"ID":"c6aa8798-c003-4836-b89b-2ec659893918","Type":"ContainerStarted","Data":"0efa22a42a22a0f57a7fcdc9a4611d9d8dbdd06ce2c49f9149e680cabfdc3137"} Apr 22 19:26:45.421374 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.421306 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bc8bbd8b5-x7lg8" podUID="c4f56637-3af9-4406-911b-fb8449f565c5" containerName="console" containerID="cri-o://bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc" gracePeriod=15 Apr 22 19:26:45.663257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.663234 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc8bbd8b5-x7lg8_c4f56637-3af9-4406-911b-fb8449f565c5/console/0.log" Apr 22 19:26:45.663381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.663302 2579 util.go:48] "No ready sandbox for pod can be found. 
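The SyncLoop (probe) entries at 19:26:10–19:26:14 above are the kubelet reacting to probe results, not API traffic: the console pod's startup probe first reports unhealthy, flips to started about five milliseconds later, and the readiness probe then marks the pod ready. A hypothetical probe pair that would produce this kind of sequence for a freshly started HTTPS server; the path and port are placeholders and are not taken from the console deployment:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Startup probe: tolerates a slow boot, polls frequently.
	startup := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/health",            // placeholder endpoint
				Port:   intstr.FromInt(8443), // placeholder port
				Scheme: corev1.URISchemeHTTPS,
			},
		},
		FailureThreshold: 30,
		PeriodSeconds:    1,
	}
	// Readiness probe: same handler, stricter failure budget.
	readiness := startup.DeepCopy()
	readiness.FailureThreshold = 3
	fmt.Println(startup, readiness)
}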
Need to start a new one" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:45.741771 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741674 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.741921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741795 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6bnh\" (UniqueName: \"kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.741921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741816 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.741921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741839 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.741921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741860 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.741921 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.741878 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert\") pod \"c4f56637-3af9-4406-911b-fb8449f565c5\" (UID: \"c4f56637-3af9-4406-911b-fb8449f565c5\") " Apr 22 19:26:45.742250 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.742220 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca" (OuterVolumeSpecName: "service-ca") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:45.742396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.742245 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:45.742396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.742283 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config" (OuterVolumeSpecName: "console-config") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:45.744101 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.744069 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh" (OuterVolumeSpecName: "kube-api-access-v6bnh") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "kube-api-access-v6bnh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:45.744101 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.744083 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:45.744218 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.744124 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c4f56637-3af9-4406-911b-fb8449f565c5" (UID: "c4f56637-3af9-4406-911b-fb8449f565c5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:45.842594 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842559 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-oauth-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:45.842594 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842590 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v6bnh\" (UniqueName: \"kubernetes.io/projected/c4f56637-3af9-4406-911b-fb8449f565c5-kube-api-access-v6bnh\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:45.842594 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842602 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-service-ca\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:45.842862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842611 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-oauth-serving-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:45.842862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842621 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4f56637-3af9-4406-911b-fb8449f565c5-console-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:45.842862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:45.842630 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f56637-3af9-4406-911b-fb8449f565c5-console-serving-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:26:46.434292 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434266 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bc8bbd8b5-x7lg8_c4f56637-3af9-4406-911b-fb8449f565c5/console/0.log" Apr 22 19:26:46.434776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434306 2579 generic.go:358] "Generic (PLEG): container finished" podID="c4f56637-3af9-4406-911b-fb8449f565c5" containerID="bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc" exitCode=2 Apr 22 19:26:46.434776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc8bbd8b5-x7lg8" event={"ID":"c4f56637-3af9-4406-911b-fb8449f565c5","Type":"ContainerDied","Data":"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc"} Apr 22 19:26:46.434776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434389 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bc8bbd8b5-x7lg8" Apr 22 19:26:46.434776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bc8bbd8b5-x7lg8" event={"ID":"c4f56637-3af9-4406-911b-fb8449f565c5","Type":"ContainerDied","Data":"aaa02d16db3f1ddb3224eae51a4168112341001a0e23966990f7725b4081d441"} Apr 22 19:26:46.434776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.434423 2579 scope.go:117] "RemoveContainer" containerID="bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc" Apr 22 19:26:46.442357 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.442340 2579 scope.go:117] "RemoveContainer" containerID="bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc" Apr 22 19:26:46.442614 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:26:46.442593 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc\": container with ID starting with bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc not found: ID does not exist" containerID="bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc" Apr 22 19:26:46.442688 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.442628 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc"} err="failed to get container status \"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc\": rpc error: code = NotFound desc = could not find container \"bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc\": container with ID starting with bc7f6ace84017ead601c41b240c22220e75d03e45b5a98df8d06833977dbc3cc not found: ID does not exist" Apr 22 19:26:46.457785 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.457754 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:26:46.461396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:46.461373 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bc8bbd8b5-x7lg8"] Apr 22 19:26:47.804859 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:47.804826 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f56637-3af9-4406-911b-fb8449f565c5" path="/var/lib/kubelet/pods/c4f56637-3af9-4406-911b-fb8449f565c5/volumes" Apr 22 19:26:57.468710 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:57.468675 2579 generic.go:358] "Generic (PLEG): container finished" podID="e672d5c5-46b6-4d03-88de-c5f1dd4735ad" containerID="4807b969d879dd0e8102659fcb265bdfea7f066f5273f1fee670be427a891c83" exitCode=0 Apr 22 19:26:57.469102 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:57.468759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq" event={"ID":"e672d5c5-46b6-4d03-88de-c5f1dd4735ad","Type":"ContainerDied","Data":"4807b969d879dd0e8102659fcb265bdfea7f066f5273f1fee670be427a891c83"} Apr 22 19:26:57.469143 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:57.469107 2579 scope.go:117] "RemoveContainer" containerID="4807b969d879dd0e8102659fcb265bdfea7f066f5273f1fee670be427a891c83" Apr 22 19:26:58.473563 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:26:58.473525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8znjq" event={"ID":"e672d5c5-46b6-4d03-88de-c5f1dd4735ad","Type":"ContainerStarted","Data":"4108c668616d690f9a683f02bba5de25f52ffd72aac9c5e703ac4fb179cb7d65"} Apr 22 19:27:20.137296 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137258 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:20.137758 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137672 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="alertmanager" containerID="cri-o://6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" gracePeriod=120 Apr 22 19:27:20.137832 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137764 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-metric" containerID="cri-o://7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" gracePeriod=120 Apr 22 19:27:20.137892 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137807 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-web" containerID="cri-o://a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" gracePeriod=120 Apr 22 19:27:20.137892 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137851 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy" containerID="cri-o://7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" gracePeriod=120 Apr 22 19:27:20.137991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137867 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="config-reloader" containerID="cri-o://f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" gracePeriod=120 Apr 22 19:27:20.137991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.137806 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="prom-label-proxy" containerID="cri-o://bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" gracePeriod=120 Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534165 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" exitCode=0 Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534188 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" exitCode=0 Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534195 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" exitCode=0 
Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534201 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" exitCode=0 Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583"} Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34"} Apr 22 19:27:20.534252 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534257 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472"} Apr 22 19:27:20.534564 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:20.534266 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a"} Apr 22 19:27:21.374595 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.374572 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.452977 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.452897 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.452977 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.452945 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xzpn\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.452985 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453018 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453061 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453112 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453136 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453168 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453209 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453236 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453282 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453320 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453347 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric\") pod \"88d1700f-d459-4331-bde4-ac06f9d8e522\" (UID: \"88d1700f-d459-4331-bde4-ac06f9d8e522\") " Apr 22 19:27:21.453499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453481 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:21.453827 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453657 2579 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-metrics-client-ca\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.453995 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.453969 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:21.454248 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.454225 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:27:21.456279 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456063 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.456476 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456443 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.456558 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456473 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn" (OuterVolumeSpecName: "kube-api-access-6xzpn") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "kube-api-access-6xzpn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:21.456558 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456469 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). 
InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.456558 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456496 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.456723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out" (OuterVolumeSpecName: "config-out") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:27:21.456896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.456856 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:21.458115 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.458093 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume" (OuterVolumeSpecName: "config-volume") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.461126 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.461104 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.466250 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.466226 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config" (OuterVolumeSpecName: "web-config") pod "88d1700f-d459-4331-bde4-ac06f9d8e522" (UID: "88d1700f-d459-4331-bde4-ac06f9d8e522"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:21.539904 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539874 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" exitCode=0 Apr 22 19:27:21.539904 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539898 2579 generic.go:358] "Generic (PLEG): container finished" podID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerID="a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" exitCode=0 Apr 22 19:27:21.540107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc"} Apr 22 19:27:21.540107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523"} Apr 22 19:27:21.540107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88d1700f-d459-4331-bde4-ac06f9d8e522","Type":"ContainerDied","Data":"4aa53cbe4b3e180e679998852f396b59b4bba685b2739ff41aa50298e2345d8b"} Apr 22 19:27:21.540107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.539993 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.540107 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.540003 2579 scope.go:117] "RemoveContainer" containerID="bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" Apr 22 19:27:21.548064 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.547904 2579 scope.go:117] "RemoveContainer" containerID="7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" Apr 22 19:27:21.554286 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554263 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554286 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554287 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6xzpn\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-kube-api-access-6xzpn\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554298 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88d1700f-d459-4331-bde4-ac06f9d8e522-tls-assets\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554309 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-config-out\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: 
I0422 19:27:21.554318 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-main-tls\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554327 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-web-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554337 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-main-db\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554345 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554354 2579 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88d1700f-d459-4331-bde4-ac06f9d8e522-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554363 2579 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-cluster-tls-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554372 2579 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-config-volume\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.554423 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.554381 2579 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88d1700f-d459-4331-bde4-ac06f9d8e522-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:27:21.555091 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.555078 2579 scope.go:117] "RemoveContainer" containerID="7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" Apr 22 19:27:21.561432 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.561412 2579 scope.go:117] "RemoveContainer" containerID="a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" Apr 22 19:27:21.563518 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.563499 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:21.568135 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.568111 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:21.568781 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.568764 2579 scope.go:117] "RemoveContainer" 
containerID="f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" Apr 22 19:27:21.575047 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.575029 2579 scope.go:117] "RemoveContainer" containerID="6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" Apr 22 19:27:21.581257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.581241 2579 scope.go:117] "RemoveContainer" containerID="b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d" Apr 22 19:27:21.587460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.587445 2579 scope.go:117] "RemoveContainer" containerID="bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" Apr 22 19:27:21.587831 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.587748 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583\": container with ID starting with bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583 not found: ID does not exist" containerID="bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" Apr 22 19:27:21.587831 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.587777 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583"} err="failed to get container status \"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583\": rpc error: code = NotFound desc = could not find container \"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583\": container with ID starting with bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583 not found: ID does not exist" Apr 22 19:27:21.587831 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.587796 2579 scope.go:117] "RemoveContainer" containerID="7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" Apr 22 19:27:21.588054 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.588038 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc\": container with ID starting with 7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc not found: ID does not exist" containerID="7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" Apr 22 19:27:21.588133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588062 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc"} err="failed to get container status \"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc\": rpc error: code = NotFound desc = could not find container \"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc\": container with ID starting with 7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc not found: ID does not exist" Apr 22 19:27:21.588133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588077 2579 scope.go:117] "RemoveContainer" containerID="7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" Apr 22 19:27:21.588291 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.588276 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34\": container with 
ID starting with 7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34 not found: ID does not exist" containerID="7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" Apr 22 19:27:21.588326 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588294 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34"} err="failed to get container status \"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34\": rpc error: code = NotFound desc = could not find container \"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34\": container with ID starting with 7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34 not found: ID does not exist" Apr 22 19:27:21.588326 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588305 2579 scope.go:117] "RemoveContainer" containerID="a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" Apr 22 19:27:21.588489 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.588476 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523\": container with ID starting with a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523 not found: ID does not exist" containerID="a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" Apr 22 19:27:21.588528 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588490 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523"} err="failed to get container status \"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523\": rpc error: code = NotFound desc = could not find container \"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523\": container with ID starting with a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523 not found: ID does not exist" Apr 22 19:27:21.588528 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588502 2579 scope.go:117] "RemoveContainer" containerID="f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" Apr 22 19:27:21.588686 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.588672 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472\": container with ID starting with f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472 not found: ID does not exist" containerID="f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" Apr 22 19:27:21.588748 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588688 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472"} err="failed to get container status \"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472\": rpc error: code = NotFound desc = could not find container \"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472\": container with ID starting with f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472 not found: ID does not exist" Apr 22 19:27:21.588748 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588699 2579 scope.go:117] "RemoveContainer" 
containerID="6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" Apr 22 19:27:21.588942 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.588924 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a\": container with ID starting with 6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a not found: ID does not exist" containerID="6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" Apr 22 19:27:21.588985 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588946 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a"} err="failed to get container status \"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a\": rpc error: code = NotFound desc = could not find container \"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a\": container with ID starting with 6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a not found: ID does not exist" Apr 22 19:27:21.588985 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.588960 2579 scope.go:117] "RemoveContainer" containerID="b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d" Apr 22 19:27:21.589166 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:27:21.589149 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d\": container with ID starting with b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d not found: ID does not exist" containerID="b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d" Apr 22 19:27:21.589230 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589176 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d"} err="failed to get container status \"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d\": rpc error: code = NotFound desc = could not find container \"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d\": container with ID starting with b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d not found: ID does not exist" Apr 22 19:27:21.589230 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589199 2579 scope.go:117] "RemoveContainer" containerID="bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583" Apr 22 19:27:21.589431 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589410 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583"} err="failed to get container status \"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583\": rpc error: code = NotFound desc = could not find container \"bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583\": container with ID starting with bd9368031db9b52ea2ca2146c8ceba152b5747c71a8847d4b7187146a4c7d583 not found: ID does not exist" Apr 22 19:27:21.589478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589431 2579 scope.go:117] "RemoveContainer" containerID="7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc" Apr 22 19:27:21.589620 ip-10-0-133-159 kubenswrapper[2579]: I0422 
19:27:21.589602 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc"} err="failed to get container status \"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc\": rpc error: code = NotFound desc = could not find container \"7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc\": container with ID starting with 7021fea5dcecf41c11e164eccfc79f570babb815d6f152e8d09b02427cedfffc not found: ID does not exist" Apr 22 19:27:21.589660 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589621 2579 scope.go:117] "RemoveContainer" containerID="7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34" Apr 22 19:27:21.589809 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589790 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34"} err="failed to get container status \"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34\": rpc error: code = NotFound desc = could not find container \"7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34\": container with ID starting with 7c576d50cac6f6ee000da5fb184409825089e47d3bd02a8a0da6f9d22f5edf34 not found: ID does not exist" Apr 22 19:27:21.589858 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589811 2579 scope.go:117] "RemoveContainer" containerID="a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523" Apr 22 19:27:21.590015 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.589995 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523"} err="failed to get container status \"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523\": rpc error: code = NotFound desc = could not find container \"a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523\": container with ID starting with a166a5fcf3f665d42651d3dc1b27bab079b4f05d30794688d559573ff6d48523 not found: ID does not exist" Apr 22 19:27:21.590052 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590016 2579 scope.go:117] "RemoveContainer" containerID="f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472" Apr 22 19:27:21.590241 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590224 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472"} err="failed to get container status \"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472\": rpc error: code = NotFound desc = could not find container \"f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472\": container with ID starting with f3ae14687adbe1ca6ddcc65a08080f7936d7e85b9d2e3616a8ddb818a50e5472 not found: ID does not exist" Apr 22 19:27:21.590301 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590241 2579 scope.go:117] "RemoveContainer" containerID="6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a" Apr 22 19:27:21.590459 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590430 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a"} err="failed to get container status \"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a\": rpc error: code = 
NotFound desc = could not find container \"6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a\": container with ID starting with 6c1009567623927a9cb8eee1e7cd47fec87ccea82af534e1166ea2adeafc745a not found: ID does not exist"
Apr 22 19:27:21.590459 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590451 2579 scope.go:117] "RemoveContainer" containerID="b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d"
Apr 22 19:27:21.590645 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.590628 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d"} err="failed to get container status \"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d\": rpc error: code = NotFound desc = could not find container \"b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d\": container with ID starting with b623dc79ad58b2adfd24c2a66b02b7b9a77be43be30da84cc95443fde475b72d not found: ID does not exist"
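[Editor's note] The burst of "ContainerStatus from runtime service failed ... NotFound" errors and "DeleteContainer returned error" lines above is noisy but benign: several cleanup paths race to remove the same already-deleted containers, and the runtime (cri-o here) answers NotFound for IDs that no longer exist. A client that wants deletion to be idempotent can treat NotFound as success. A minimal sketch of that pattern follows; it is not the kubelet's implementation, and removeFn is a hypothetical stand-in for the actual CRI remove call.

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainerIdempotent treats a gRPC NotFound as "already removed",
// which turns the race visible in the log above into a no-op.
func removeContainerIdempotent(id string, removeFn func(string) error) error {
	err := removeFn(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil
	}
	return fmt.Errorf("removing container %s: %w", id, err)
}

func main() {
	// Simulate the situation from the log: the container is already gone.
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	if err := removeContainerIdempotent("bd9368031db9", gone); err != nil {
		fmt.Println("unexpected:", err)
		return
	}
	fmt.Println("NotFound treated as success; nothing left to do")
}
```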
Apr 22 19:27:21.595568 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595547 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:27:21.595824 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595813 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4f56637-3af9-4406-911b-fb8449f565c5" containerName="console"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595826 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f56637-3af9-4406-911b-fb8449f565c5" containerName="console"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595836 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595842 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595852 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="alertmanager"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595857 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="alertmanager"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595865 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="prom-label-proxy"
Apr 22 19:27:21.595875 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595871 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="prom-label-proxy"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595881 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="init-config-reloader"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595889 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="init-config-reloader"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595903 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="config-reloader"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595910 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="config-reloader"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595916 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-web"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595922 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-web"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595928 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-metric"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595933 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-metric"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595982 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-web"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595990 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="alertmanager"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.595996 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy-metric"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.596001 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="prom-label-proxy"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.596007 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="config-reloader"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.596012 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d1700f-d459-4331-bde4-ac06f9d8e522" containerName="kube-rbac-proxy"
Apr 22 19:27:21.596076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.596019 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4f56637-3af9-4406-911b-fb8449f565c5" containerName="console"
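[Editor's note] The RemoveStaleState lines above are the CPU and memory managers dropping bookkeeping for pods that no longer exist. The recreated alertmanager-main-0 keeps its name but gets a new UID (88d1700f-... is replaced by ad0a23d8-... later in this log), so entries keyed by the old UID are swept, along with leftovers from other deleted pods (the console container here), before the new pod is admitted. Below is a minimal sketch of that shape, assuming state keyed by (podUID, containerName); it is not the kubelet's actual data structure.

```go
package main

import "fmt"

type key struct{ podUID, container string }

// assignments stands in for per-container resource state (CPU sets,
// memory reservations); the value type is just a label here.
type assignments map[key]string

// removeStaleState drops every entry whose pod UID is no longer active,
// mirroring the sweep logged by cpu_manager.go and memory_manager.go above.
func removeStaleState(a assignments, active map[string]bool) {
	for k := range a {
		if !active[k.podUID] {
			fmt.Printf("removing stale state podUID=%q container=%q\n", k.podUID, k.container)
			delete(a, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	a := assignments{
		{"88d1700f-d459-4331-bde4-ac06f9d8e522", "alertmanager"}: "old UID",
		{"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5", "alertmanager"}: "new UID",
	}
	removeStaleState(a, map[string]bool{"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5": true})
	fmt.Println("entries left:", len(a))
}
```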
Apr 22 19:27:21.601053 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.601035 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:27:21.604118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604094 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 19:27:21.604247 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604141 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 19:27:21.604247 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604152 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 19:27:21.604247 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604152 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 19:27:21.604598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604393 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 19:27:21.604598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604500 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 19:27:21.604598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604531 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 19:27:21.604598 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604501 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cfdjc\""
Apr 22 19:27:21.604823 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.604812 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 19:27:21.608807 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.608788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 19:27:21.613911 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.613889 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:27:21.654942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.654907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:27:21.654942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.654942 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:27:21.655152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.654964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-out\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655356 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655356 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-web-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655356 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655280 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655356 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nzkh\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-kube-api-access-5nzkh\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.655534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.655438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.756800 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-web-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.756800 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.756800 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nzkh\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-kube-api-access-5nzkh\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.756974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757056 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-out\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757178 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.757381 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.757305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.758112 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.758082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.758788 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.758761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.759965 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.759908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760062 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.759987 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-web-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760207 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.760177 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-config-out\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760330 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.760305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760393 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.760311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760452 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.760407 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.760452 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.760430 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:21.761095 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:27:21.761075 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:27:21.761831 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.761813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:27:21.765841 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:21.765821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nzkh\" (UniqueName: \"kubernetes.io/projected/ad0a23d8-c6df-4985-ae22-96cdc1d30ef5-kube-api-access-5nzkh\") pod \"alertmanager-main-0\" (UID: \"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5\") " pod="openshift-monitoring/alertmanager-main-0"
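[Editor's note] The volume lines for the new pod follow a fixed three-step pattern: "operationExecutor.VerifyControllerAttachedVolume started", then "operationExecutor.MountVolume started", then "MountVolume.SetUp succeeded", once per volume, driven by a reconciler that compares the pod's desired volumes against what is actually mounted. The sketch below shows only that shape, assuming a single pass; the kubelet's reconciler is far richer (attach/detach, retries with backoff, device paths), so treat this as an illustration, not the real implementation.

```go
package main

import "fmt"

type volume struct{ name, plugin string }

// reconcile drives every desired-but-unmounted volume through SetUp,
// echoing the message sequence visible in the log above. Volumes that
// fail are simply retried on the next pass.
func reconcile(desired []volume, mounted map[string]bool, setUp func(volume) error) {
	for _, v := range desired {
		if mounted[v.name] {
			continue // actual state already matches desired state
		}
		fmt.Printf("MountVolume started for volume %q (%s)\n", v.name, v.plugin)
		if err := setUp(v); err != nil {
			fmt.Printf("SetUp failed for %q: %v\n", v.name, err)
			continue
		}
		mounted[v.name] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
	}
}

func main() {
	desired := []volume{
		{"config-volume", "kubernetes.io/secret"},
		{"alertmanager-main-db", "kubernetes.io/empty-dir"},
	}
	reconcile(desired, map[string]bool{}, func(volume) error { return nil })
}
```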
event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"14a313e81056a89fd546cff969f1914192d5c55ac86c266d1f5441cc74be657c"} Apr 22 19:27:23.550839 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.550837 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"03f800a372f9363564706cb8e58babfc0ec0282179a43fcbf17f34f43a0da54c"} Apr 22 19:27:23.550839 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.550847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"edbdd3d8e85b412e722fa1015a59652410809b377f6c0fea65d0fa17b7a1c4c5"} Apr 22 19:27:23.551454 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.550856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"5c14f789de0f1bc1ad1be8f0951783567dca9e99de63a5429d68feb02e4d8dfc"} Apr 22 19:27:23.551454 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.550864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"7a994b6b9a54154653488b9fbc601e5dc9e00e8e5fcb7cec1bf45c47241dce39"} Apr 22 19:27:23.551454 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.550871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad0a23d8-c6df-4985-ae22-96cdc1d30ef5","Type":"ContainerStarted","Data":"2ad02cfd8efe5629cf17aac9f247f1f5b6436150a5a5365c29986184c4aa2277"} Apr 22 19:27:23.580155 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:23.580106 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.58009137 podStartE2EDuration="2.58009137s" podCreationTimestamp="2026-04-22 19:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:23.578318029 +0000 UTC m=+246.383159332" watchObservedRunningTime="2026-04-22 19:27:23.58009137 +0000 UTC m=+246.384932706" Apr 22 19:27:24.180348 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.180311 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-66495fb49d-kb29w"] Apr 22 19:27:24.183754 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.183717 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.186762 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.186740 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 19:27:24.186954 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.186794 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 19:27:24.186954 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.186899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-crkbx\"" Apr 22 19:27:24.187130 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.187111 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 19:27:24.187249 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.187117 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 19:27:24.189502 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.189483 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 19:27:24.192429 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.192409 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 19:27:24.206658 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.206630 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66495fb49d-kb29w"] Apr 22 19:27:24.284396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284396 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284401 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-serving-certs-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-metrics-client-ca\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-federate-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcd6p\" (UniqueName: \"kubernetes.io/projected/434a8f7d-a89d-4e80-8940-ec8a6154255c-kube-api-access-xcd6p\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.284617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.284553 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-serving-certs-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-metrics-client-ca\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385873 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385599 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-federate-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385873 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.385873 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385805 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcd6p\" (UniqueName: \"kubernetes.io/projected/434a8f7d-a89d-4e80-8940-ec8a6154255c-kube-api-access-xcd6p\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.386027 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385877 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.386027 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.385981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.386392 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.386364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-serving-certs-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.386508 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.386482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.386689 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.386661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/434a8f7d-a89d-4e80-8940-ec8a6154255c-metrics-client-ca\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.388255 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.388232 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-telemeter-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.388336 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.388311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.388503 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.388475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-federate-client-tls\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.388541 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.388512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/434a8f7d-a89d-4e80-8940-ec8a6154255c-secret-telemeter-client\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.394795 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.394772 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcd6p\" (UniqueName: \"kubernetes.io/projected/434a8f7d-a89d-4e80-8940-ec8a6154255c-kube-api-access-xcd6p\") pod \"telemeter-client-66495fb49d-kb29w\" (UID: \"434a8f7d-a89d-4e80-8940-ec8a6154255c\") " pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.494822 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.494724 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" Apr 22 19:27:24.642918 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:24.642880 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66495fb49d-kb29w"] Apr 22 19:27:24.645658 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:27:24.645625 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434a8f7d_a89d_4e80_8940_ec8a6154255c.slice/crio-7f4aa89eae6a9f3299ba51688e37cb9774c9c7c1af6a19b7fac0c3423321f797 WatchSource:0}: Error finding container 7f4aa89eae6a9f3299ba51688e37cb9774c9c7c1af6a19b7fac0c3423321f797: Status 404 returned error can't find the container with id 7f4aa89eae6a9f3299ba51688e37cb9774c9c7c1af6a19b7fac0c3423321f797 Apr 22 19:27:25.559658 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:25.559617 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" event={"ID":"434a8f7d-a89d-4e80-8940-ec8a6154255c","Type":"ContainerStarted","Data":"7f4aa89eae6a9f3299ba51688e37cb9774c9c7c1af6a19b7fac0c3423321f797"} Apr 22 19:27:26.564178 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:26.564150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" event={"ID":"434a8f7d-a89d-4e80-8940-ec8a6154255c","Type":"ContainerStarted","Data":"c10f14dc3d3c4227b90d3cf862098b0d335913601d5b4b961fc9461b0e78eac7"} Apr 22 19:27:26.564578 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:26.564184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" event={"ID":"434a8f7d-a89d-4e80-8940-ec8a6154255c","Type":"ContainerStarted","Data":"a505105cbdf2fc60e94a8287d90e8a5696f6f2b42576b1827a1685c49b2d768c"} Apr 22 19:27:26.564578 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:26.564193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" event={"ID":"434a8f7d-a89d-4e80-8940-ec8a6154255c","Type":"ContainerStarted","Data":"a9e0f5254f8e8a4819b5c914e77edb0cf56116dacef83a97f10c6bd809fdfaa4"} Apr 22 19:27:26.588032 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:26.587972 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-66495fb49d-kb29w" podStartSLOduration=0.874874687 podStartE2EDuration="2.587952124s" podCreationTimestamp="2026-04-22 19:27:24 +0000 UTC" firstStartedPulling="2026-04-22 19:27:24.647544721 +0000 UTC m=+247.452386006" lastFinishedPulling="2026-04-22 19:27:26.360622151 +0000 UTC m=+249.165463443" observedRunningTime="2026-04-22 19:27:26.587311183 +0000 UTC m=+249.392152487" watchObservedRunningTime="2026-04-22 19:27:26.587952124 +0000 UTC m=+249.392793429" Apr 22 19:27:29.634845 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:29.634797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:27:29.637211 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:29.637183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9c0f6922-8799-4caa-adfb-fa958fee9291-metrics-certs\") pod \"network-metrics-daemon-qwbg8\" (UID: \"9c0f6922-8799-4caa-adfb-fa958fee9291\") " pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:27:29.705097 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:29.705061 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kmhxv\"" Apr 22 19:27:29.712991 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:29.712963 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qwbg8" Apr 22 19:27:29.854596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:29.854534 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qwbg8"] Apr 22 19:27:29.857611 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:27:29.857580 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c0f6922_8799_4caa_adfb_fa958fee9291.slice/crio-5037c22b1fa1011deb01810d9fc87020c1978c7411478ccdd7d7c92585a87fc6 WatchSource:0}: Error finding container 5037c22b1fa1011deb01810d9fc87020c1978c7411478ccdd7d7c92585a87fc6: Status 404 returned error can't find the container with id 5037c22b1fa1011deb01810d9fc87020c1978c7411478ccdd7d7c92585a87fc6 Apr 22 19:27:30.577033 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:30.576992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwbg8" event={"ID":"9c0f6922-8799-4caa-adfb-fa958fee9291","Type":"ContainerStarted","Data":"5037c22b1fa1011deb01810d9fc87020c1978c7411478ccdd7d7c92585a87fc6"} Apr 22 19:27:31.581973 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:31.581940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwbg8" event={"ID":"9c0f6922-8799-4caa-adfb-fa958fee9291","Type":"ContainerStarted","Data":"2527c42697364c2fa9b98a9b114c5372301037c52c956847f6358ca41f0778a2"} Apr 22 19:27:31.581973 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:31.581974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qwbg8" event={"ID":"9c0f6922-8799-4caa-adfb-fa958fee9291","Type":"ContainerStarted","Data":"a5662ca8391943b9652c2d4c556392ce4084100c112583763b5df8384e57ab9a"} Apr 22 19:27:31.602949 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:27:31.602897 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qwbg8" podStartSLOduration=253.730527018 podStartE2EDuration="4m14.602880598s" podCreationTimestamp="2026-04-22 19:23:17 +0000 UTC" firstStartedPulling="2026-04-22 19:27:29.862706172 +0000 UTC m=+252.667547459" lastFinishedPulling="2026-04-22 19:27:30.735059754 +0000 UTC m=+253.539901039" observedRunningTime="2026-04-22 19:27:31.598617776 +0000 UTC m=+254.403459080" watchObservedRunningTime="2026-04-22 19:27:31.602880598 +0000 UTC m=+254.407721881" Apr 22 19:28:17.681603 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:17.681577 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:28:17.683265 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:17.683233 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:28:17.688664 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:17.688640 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:36.415155 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.415119 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:28:36.418310 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.418290 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.422398 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.422378 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:28:36.423431 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.423410 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:28:36.423532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.423444 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:28:36.423702 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.423680 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:28:36.423798 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.423752 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:28:36.423858 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.423814 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:28:36.424870 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.424852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qlpqn\"" Apr 22 19:28:36.425086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.425065 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:28:36.429433 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.429415 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:28:36.433267 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.433244 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:28:36.489891 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.489891 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " 
pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.490103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.490103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489951 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llf98\" (UniqueName: \"kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.490103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.490103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.489997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.490103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.490019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591170 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591445 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591581 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591702 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591688 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llf98\" (UniqueName: \"kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591855 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.591989 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.591976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.592095 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.592083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.595757 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.595323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.596054 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.596032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.596466 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.596434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.596942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.596915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.597378 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.597358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 
19:28:36.598349 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.598331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.604851 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.604828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llf98\" (UniqueName: \"kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98\") pod \"console-d8c8fd9d5-dvh72\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.727862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.727721 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:36.860274 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.860247 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:28:36.862552 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:28:36.862519 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod511aeacd_41b4_4f30_8f5e_d1c54408bfc1.slice/crio-16c7e8713fcc2b003eb672e4c7e84ce9781917161d08aa1e6044af38e820eb7a WatchSource:0}: Error finding container 16c7e8713fcc2b003eb672e4c7e84ce9781917161d08aa1e6044af38e820eb7a: Status 404 returned error can't find the container with id 16c7e8713fcc2b003eb672e4c7e84ce9781917161d08aa1e6044af38e820eb7a Apr 22 19:28:36.864320 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:36.864306 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:28:37.783446 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:37.783413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8c8fd9d5-dvh72" event={"ID":"511aeacd-41b4-4f30-8f5e-d1c54408bfc1","Type":"ContainerStarted","Data":"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441"} Apr 22 19:28:37.783446 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:37.783451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8c8fd9d5-dvh72" event={"ID":"511aeacd-41b4-4f30-8f5e-d1c54408bfc1","Type":"ContainerStarted","Data":"16c7e8713fcc2b003eb672e4c7e84ce9781917161d08aa1e6044af38e820eb7a"} Apr 22 19:28:37.804251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:37.804205 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d8c8fd9d5-dvh72" podStartSLOduration=1.804188122 podStartE2EDuration="1.804188122s" podCreationTimestamp="2026-04-22 19:28:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:28:37.803049484 +0000 UTC m=+320.607890847" watchObservedRunningTime="2026-04-22 19:28:37.804188122 +0000 UTC m=+320.609029428" Apr 22 19:28:46.728622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:46.728564 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:46.728622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:46.728620 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:46.738040 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:46.737435 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:28:46.814261 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:28:46.814232 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:32:03.708626 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.708540 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:03.712196 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.712173 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.716462 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.716436 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 19:32:03.716606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.716459 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:32:03.716606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.716491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:32:03.716606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.716459 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-q76zq\"" Apr 22 19:32:03.721838 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.721811 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:03.752841 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.752806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-hv5pj"] Apr 22 19:32:03.755994 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.755975 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.758656 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.758635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4455v\"" Apr 22 19:32:03.758811 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.758635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:32:03.766248 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.766223 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hv5pj"] Apr 22 19:32:03.842775 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.842712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzshp\" (UniqueName: \"kubernetes.io/projected/7e60006c-826d-4997-9ffc-e6d055a80ad6-kube-api-access-kzshp\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: \"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.842983 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.842794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmzz\" (UniqueName: \"kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.842983 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.842816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e60006c-826d-4997-9ffc-e6d055a80ad6-data\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: \"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.842983 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.842929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.944287 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.944247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzshp\" (UniqueName: \"kubernetes.io/projected/7e60006c-826d-4997-9ffc-e6d055a80ad6-kube-api-access-kzshp\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: \"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.944499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.944302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slmzz\" (UniqueName: \"kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.944499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.944321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e60006c-826d-4997-9ffc-e6d055a80ad6-data\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: 
\"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.944499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.944366 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.944830 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.944804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e60006c-826d-4997-9ffc-e6d055a80ad6-data\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: \"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:03.946784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.946766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.953268 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.953239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmzz\" (UniqueName: \"kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz\") pod \"kserve-controller-manager-545d8995fb-kdcpf\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:03.953404 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:03.953289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzshp\" (UniqueName: \"kubernetes.io/projected/7e60006c-826d-4997-9ffc-e6d055a80ad6-kube-api-access-kzshp\") pod \"seaweedfs-86cc847c5c-hv5pj\" (UID: \"7e60006c-826d-4997-9ffc-e6d055a80ad6\") " pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:04.024468 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.024359 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:04.066595 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.066557 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:04.152899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.152754 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:04.156297 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:32:04.156138 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef3c45ce_7332_474f_b728_c8fe056f81fd.slice/crio-515c386efcc9c5a6fc9c3cbb76a30013678f8d776c4a5044e322534fecc2f03b WatchSource:0}: Error finding container 515c386efcc9c5a6fc9c3cbb76a30013678f8d776c4a5044e322534fecc2f03b: Status 404 returned error can't find the container with id 515c386efcc9c5a6fc9c3cbb76a30013678f8d776c4a5044e322534fecc2f03b Apr 22 19:32:04.203191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.203115 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hv5pj"] Apr 22 19:32:04.206277 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:32:04.206248 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e60006c_826d_4997_9ffc_e6d055a80ad6.slice/crio-45889b952705f5dfb01ce73ee0c5ec6ebcf1be90a9fc3ac39af4e76da86bfb09 WatchSource:0}: Error finding container 45889b952705f5dfb01ce73ee0c5ec6ebcf1be90a9fc3ac39af4e76da86bfb09: Status 404 returned error can't find the container with id 45889b952705f5dfb01ce73ee0c5ec6ebcf1be90a9fc3ac39af4e76da86bfb09 Apr 22 19:32:04.397589 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.397553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" event={"ID":"ef3c45ce-7332-474f-b728-c8fe056f81fd","Type":"ContainerStarted","Data":"515c386efcc9c5a6fc9c3cbb76a30013678f8d776c4a5044e322534fecc2f03b"} Apr 22 19:32:04.398531 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:04.398508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hv5pj" event={"ID":"7e60006c-826d-4997-9ffc-e6d055a80ad6","Type":"ContainerStarted","Data":"45889b952705f5dfb01ce73ee0c5ec6ebcf1be90a9fc3ac39af4e76da86bfb09"} Apr 22 19:32:08.414185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.414147 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" event={"ID":"ef3c45ce-7332-474f-b728-c8fe056f81fd","Type":"ContainerStarted","Data":"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44"} Apr 22 19:32:08.414641 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.414251 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:08.415524 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.415501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hv5pj" event={"ID":"7e60006c-826d-4997-9ffc-e6d055a80ad6","Type":"ContainerStarted","Data":"9c9126074233277148517ee0d7fc5aa4bba75abb0c13b36e72b7a4bc41b25bd4"} Apr 22 19:32:08.415632 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.415617 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:08.431036 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.430993 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" 
podStartSLOduration=1.81227839 podStartE2EDuration="5.430979863s" podCreationTimestamp="2026-04-22 19:32:03 +0000 UTC" firstStartedPulling="2026-04-22 19:32:04.158015144 +0000 UTC m=+526.962856427" lastFinishedPulling="2026-04-22 19:32:07.776716611 +0000 UTC m=+530.581557900" observedRunningTime="2026-04-22 19:32:08.430239846 +0000 UTC m=+531.235081152" watchObservedRunningTime="2026-04-22 19:32:08.430979863 +0000 UTC m=+531.235821167" Apr 22 19:32:08.446191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:08.446145 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-hv5pj" podStartSLOduration=1.81889798 podStartE2EDuration="5.446129838s" podCreationTimestamp="2026-04-22 19:32:03 +0000 UTC" firstStartedPulling="2026-04-22 19:32:04.207687297 +0000 UTC m=+527.012528579" lastFinishedPulling="2026-04-22 19:32:07.83491914 +0000 UTC m=+530.639760437" observedRunningTime="2026-04-22 19:32:08.44466755 +0000 UTC m=+531.249508855" watchObservedRunningTime="2026-04-22 19:32:08.446129838 +0000 UTC m=+531.250971141" Apr 22 19:32:14.421448 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:14.421419 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-hv5pj" Apr 22 19:32:28.008680 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.008650 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d979f574c-9pjz9"] Apr 22 19:32:28.013558 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.013535 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.021573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.021549 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d979f574c-9pjz9"] Apr 22 19:32:28.164206 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-oauth-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164399 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164221 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164399 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-service-ca\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164399 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-trusted-ca-bundle\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " 
pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164399 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164355 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164399 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-oauth-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.164560 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.164400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbttj\" (UniqueName: \"kubernetes.io/projected/f520a25a-85c4-4b6b-934e-0fa9933832ac-kube-api-access-nbttj\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265583 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265583 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-service-ca\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265830 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-trusted-ca-bundle\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265830 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265765 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265830 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-oauth-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.265830 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265818 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nbttj\" (UniqueName: \"kubernetes.io/projected/f520a25a-85c4-4b6b-934e-0fa9933832ac-kube-api-access-nbttj\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.266027 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.265864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-oauth-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.266334 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.266304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-service-ca\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.266457 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.266337 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.266545 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.266516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-oauth-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.266785 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.266761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f520a25a-85c4-4b6b-934e-0fa9933832ac-trusted-ca-bundle\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.268379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.268356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-serving-cert\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.268526 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.268509 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f520a25a-85c4-4b6b-934e-0fa9933832ac-console-oauth-config\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.277384 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.277354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbttj\" (UniqueName: \"kubernetes.io/projected/f520a25a-85c4-4b6b-934e-0fa9933832ac-kube-api-access-nbttj\") pod \"console-5d979f574c-9pjz9\" (UID: \"f520a25a-85c4-4b6b-934e-0fa9933832ac\") " pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.323523 
ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.323484 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:28.449233 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.449201 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d979f574c-9pjz9"] Apr 22 19:32:28.451851 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:32:28.451817 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf520a25a_85c4_4b6b_934e_0fa9933832ac.slice/crio-6ef8cffeed16545da0c708a9627cf640694aad0746cdd07c38735ef3657a6266 WatchSource:0}: Error finding container 6ef8cffeed16545da0c708a9627cf640694aad0746cdd07c38735ef3657a6266: Status 404 returned error can't find the container with id 6ef8cffeed16545da0c708a9627cf640694aad0746cdd07c38735ef3657a6266 Apr 22 19:32:28.478182 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:28.478151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d979f574c-9pjz9" event={"ID":"f520a25a-85c4-4b6b-934e-0fa9933832ac","Type":"ContainerStarted","Data":"6ef8cffeed16545da0c708a9627cf640694aad0746cdd07c38735ef3657a6266"} Apr 22 19:32:29.482017 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:29.481975 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d979f574c-9pjz9" event={"ID":"f520a25a-85c4-4b6b-934e-0fa9933832ac","Type":"ContainerStarted","Data":"96f046130c57ae430df9ad278e7a5dda031b2bdc858ecd83fbc3c1a243229482"} Apr 22 19:32:29.501218 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:29.501167 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d979f574c-9pjz9" podStartSLOduration=2.501153361 podStartE2EDuration="2.501153361s" podCreationTimestamp="2026-04-22 19:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:32:29.499535975 +0000 UTC m=+552.304377289" watchObservedRunningTime="2026-04-22 19:32:29.501153361 +0000 UTC m=+552.305994665" Apr 22 19:32:38.324076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:38.324030 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:38.324076 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:38.324085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:38.328586 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:38.328562 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:38.511350 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:38.511322 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d979f574c-9pjz9" Apr 22 19:32:38.559540 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:38.559505 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:32:39.424419 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.424384 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:39.652902 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.652857 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:39.653186 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.653141 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" podUID="ef3c45ce-7332-474f-b728-c8fe056f81fd" containerName="manager" containerID="cri-o://f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44" gracePeriod=10 Apr 22 19:32:39.678422 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.678360 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-7258g"] Apr 22 19:32:39.684303 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.681624 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.688595 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.688568 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-7258g"] Apr 22 19:32:39.762462 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.762419 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlx5\" (UniqueName: \"kubernetes.io/projected/f5adafa5-fc01-4bc5-8639-c47908aec837-kube-api-access-tjlx5\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: \"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.762618 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.762492 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5adafa5-fc01-4bc5-8639-c47908aec837-cert\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: \"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.863432 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.863391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlx5\" (UniqueName: \"kubernetes.io/projected/f5adafa5-fc01-4bc5-8639-c47908aec837-kube-api-access-tjlx5\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: \"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.863622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.863451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5adafa5-fc01-4bc5-8639-c47908aec837-cert\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: \"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.865863 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.865836 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5adafa5-fc01-4bc5-8639-c47908aec837-cert\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: \"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.872821 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.872792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlx5\" (UniqueName: \"kubernetes.io/projected/f5adafa5-fc01-4bc5-8639-c47908aec837-kube-api-access-tjlx5\") pod \"kserve-controller-manager-545d8995fb-7258g\" (UID: 
\"f5adafa5-fc01-4bc5-8639-c47908aec837\") " pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:39.896698 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.896676 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:39.964342 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.964253 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slmzz\" (UniqueName: \"kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz\") pod \"ef3c45ce-7332-474f-b728-c8fe056f81fd\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " Apr 22 19:32:39.964342 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.964339 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert\") pod \"ef3c45ce-7332-474f-b728-c8fe056f81fd\" (UID: \"ef3c45ce-7332-474f-b728-c8fe056f81fd\") " Apr 22 19:32:39.966588 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.966557 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert" (OuterVolumeSpecName: "cert") pod "ef3c45ce-7332-474f-b728-c8fe056f81fd" (UID: "ef3c45ce-7332-474f-b728-c8fe056f81fd"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:32:39.966780 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:39.966601 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz" (OuterVolumeSpecName: "kube-api-access-slmzz") pod "ef3c45ce-7332-474f-b728-c8fe056f81fd" (UID: "ef3c45ce-7332-474f-b728-c8fe056f81fd"). InnerVolumeSpecName "kube-api-access-slmzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:32:40.019822 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.019788 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:40.065120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.065085 2579 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef3c45ce-7332-474f-b728-c8fe056f81fd-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:32:40.065300 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.065124 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-slmzz\" (UniqueName: \"kubernetes.io/projected/ef3c45ce-7332-474f-b728-c8fe056f81fd-kube-api-access-slmzz\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:32:40.144033 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.143959 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-7258g"] Apr 22 19:32:40.146301 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:32:40.146275 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5adafa5_fc01_4bc5_8639_c47908aec837.slice/crio-924b1d3a514c36afc425941c4e204760dfe699791085b2b1c65bb623ce89cb6a WatchSource:0}: Error finding container 924b1d3a514c36afc425941c4e204760dfe699791085b2b1c65bb623ce89cb6a: Status 404 returned error can't find the container with id 924b1d3a514c36afc425941c4e204760dfe699791085b2b1c65bb623ce89cb6a Apr 22 19:32:40.517245 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.517203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-7258g" event={"ID":"f5adafa5-fc01-4bc5-8639-c47908aec837","Type":"ContainerStarted","Data":"924b1d3a514c36afc425941c4e204760dfe699791085b2b1c65bb623ce89cb6a"} Apr 22 19:32:40.518412 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.518386 2579 generic.go:358] "Generic (PLEG): container finished" podID="ef3c45ce-7332-474f-b728-c8fe056f81fd" containerID="f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44" exitCode=0 Apr 22 19:32:40.518499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.518437 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" event={"ID":"ef3c45ce-7332-474f-b728-c8fe056f81fd","Type":"ContainerDied","Data":"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44"} Apr 22 19:32:40.518499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.518456 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" Apr 22 19:32:40.518499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.518475 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-kdcpf" event={"ID":"ef3c45ce-7332-474f-b728-c8fe056f81fd","Type":"ContainerDied","Data":"515c386efcc9c5a6fc9c3cbb76a30013678f8d776c4a5044e322534fecc2f03b"} Apr 22 19:32:40.518499 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.518491 2579 scope.go:117] "RemoveContainer" containerID="f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44" Apr 22 19:32:40.527793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.527768 2579 scope.go:117] "RemoveContainer" containerID="f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44" Apr 22 19:32:40.528135 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:32:40.528106 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44\": container with ID starting with f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44 not found: ID does not exist" containerID="f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44" Apr 22 19:32:40.528238 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.528147 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44"} err="failed to get container status \"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44\": rpc error: code = NotFound desc = could not find container \"f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44\": container with ID starting with f1bce0573e889da4ebf35280b35c6e972564014f9f33a152dc7f802a8b5a8c44 not found: ID does not exist" Apr 22 19:32:40.558192 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.558162 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:40.561984 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:40.561950 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-kdcpf"] Apr 22 19:32:41.523209 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:41.523171 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-7258g" event={"ID":"f5adafa5-fc01-4bc5-8639-c47908aec837","Type":"ContainerStarted","Data":"a549ede69dc6cf8e73c8264af3cb9adc9c5922d9dd7dd0dbb2c861061306b64b"} Apr 22 19:32:41.523701 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:41.523349 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:32:41.541868 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:32:41.541813 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-7258g" podStartSLOduration=2.178932171 podStartE2EDuration="2.54179715s" podCreationTimestamp="2026-04-22 19:32:39 +0000 UTC" firstStartedPulling="2026-04-22 19:32:40.147541428 +0000 UTC m=+562.952382714" lastFinishedPulling="2026-04-22 19:32:40.510406402 +0000 UTC m=+563.315247693" observedRunningTime="2026-04-22 19:32:41.539825537 +0000 UTC m=+564.344666841" watchObservedRunningTime="2026-04-22 19:32:41.54179715 +0000 UTC m=+564.346638453" Apr 22 19:32:41.805453 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:32:41.805369 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3c45ce-7332-474f-b728-c8fe056f81fd" path="/var/lib/kubelet/pods/ef3c45ce-7332-474f-b728-c8fe056f81fd/volumes" Apr 22 19:33:03.581633 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.581562 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d8c8fd9d5-dvh72" podUID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" containerName="console" containerID="cri-o://a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441" gracePeriod=15 Apr 22 19:33:03.828829 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.828795 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8c8fd9d5-dvh72_511aeacd-41b4-4f30-8f5e-d1c54408bfc1/console/0.log" Apr 22 19:33:03.828987 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.828867 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:33:03.977984 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.977943 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978028 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978063 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978113 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978172 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978403 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978216 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llf98\" (UniqueName: \"kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978403 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978257 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config\") pod \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\" (UID: \"511aeacd-41b4-4f30-8f5e-d1c54408bfc1\") " Apr 22 19:33:03.978525 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978463 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:33:03.978579 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978556 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:33:03.978720 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978691 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca" (OuterVolumeSpecName: "service-ca") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:33:03.978819 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.978789 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config" (OuterVolumeSpecName: "console-config") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:33:03.980330 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.980304 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:33:03.980451 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.980336 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:33:03.980451 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:03.980391 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98" (OuterVolumeSpecName: "kube-api-access-llf98") pod "511aeacd-41b4-4f30-8f5e-d1c54408bfc1" (UID: "511aeacd-41b4-4f30-8f5e-d1c54408bfc1"). InnerVolumeSpecName "kube-api-access-llf98". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:33:04.079573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079530 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-trusted-ca-bundle\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079565 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llf98\" (UniqueName: \"kubernetes.io/projected/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-kube-api-access-llf98\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079576 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079585 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-serving-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079877 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079593 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-console-oauth-config\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079877 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079602 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-oauth-serving-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.079877 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.079612 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/511aeacd-41b4-4f30-8f5e-d1c54408bfc1-service-ca\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:33:04.595450 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8c8fd9d5-dvh72_511aeacd-41b4-4f30-8f5e-d1c54408bfc1/console/0.log" Apr 22 19:33:04.595927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595457 2579 generic.go:358] "Generic (PLEG): container finished" podID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" containerID="a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441" exitCode=2 Apr 22 19:33:04.595927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595552 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d8c8fd9d5-dvh72" Apr 22 19:33:04.595927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8c8fd9d5-dvh72" event={"ID":"511aeacd-41b4-4f30-8f5e-d1c54408bfc1","Type":"ContainerDied","Data":"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441"} Apr 22 19:33:04.595927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8c8fd9d5-dvh72" event={"ID":"511aeacd-41b4-4f30-8f5e-d1c54408bfc1","Type":"ContainerDied","Data":"16c7e8713fcc2b003eb672e4c7e84ce9781917161d08aa1e6044af38e820eb7a"} Apr 22 19:33:04.595927 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.595633 2579 scope.go:117] "RemoveContainer" containerID="a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441" Apr 22 19:33:04.604371 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.604354 2579 scope.go:117] "RemoveContainer" containerID="a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441" Apr 22 19:33:04.604657 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:33:04.604640 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441\": container with ID starting with a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441 not found: ID does not exist" containerID="a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441" Apr 22 19:33:04.604720 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.604671 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441"} err="failed to get container status \"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441\": rpc error: code = NotFound desc = could not find container \"a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441\": container with ID starting with a8688f1426283ab583ac97468e7d656cb2750d8527b51970268d6015afd6d441 not found: ID does not exist" Apr 22 19:33:04.618064 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.618027 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:33:04.622368 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:04.622337 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d8c8fd9d5-dvh72"] Apr 22 19:33:05.804965 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:05.804933 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" path="/var/lib/kubelet/pods/511aeacd-41b4-4f30-8f5e-d1c54408bfc1/volumes" Apr 22 19:33:12.532445 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:12.532416 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-7258g" Apr 22 19:33:13.348410 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348377 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-bq8rn"] Apr 22 19:33:13.348784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348769 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" containerName="console" Apr 22 19:33:13.348784 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:33:13.348786 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" containerName="console" Apr 22 19:33:13.348881 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348797 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef3c45ce-7332-474f-b728-c8fe056f81fd" containerName="manager" Apr 22 19:33:13.348881 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348803 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3c45ce-7332-474f-b728-c8fe056f81fd" containerName="manager" Apr 22 19:33:13.348881 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348870 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="511aeacd-41b4-4f30-8f5e-d1c54408bfc1" containerName="console" Apr 22 19:33:13.348881 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.348879 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef3c45ce-7332-474f-b728-c8fe056f81fd" containerName="manager" Apr 22 19:33:13.350585 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.350566 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:13.353600 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.353577 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-4pzsm\"" Apr 22 19:33:13.353743 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.353577 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 19:33:13.364483 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.364460 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bq8rn"] Apr 22 19:33:13.460216 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.460180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:13.460386 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.460223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcd4z\" (UniqueName: \"kubernetes.io/projected/160df6da-5e63-48cf-bf25-63821f86465a-kube-api-access-fcd4z\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:13.560811 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.560763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:13.561279 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.560826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcd4z\" (UniqueName: \"kubernetes.io/projected/160df6da-5e63-48cf-bf25-63821f86465a-kube-api-access-fcd4z\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 
19:33:13.561279 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:33:13.560878 2579 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 19:33:13.561279 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:33:13.560945 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs podName:160df6da-5e63-48cf-bf25-63821f86465a nodeName:}" failed. No retries permitted until 2026-04-22 19:33:14.060928698 +0000 UTC m=+596.865769980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs") pod "model-serving-api-86f7b4b499-bq8rn" (UID: "160df6da-5e63-48cf-bf25-63821f86465a") : secret "model-serving-api-tls" not found Apr 22 19:33:13.570816 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:13.570792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcd4z\" (UniqueName: \"kubernetes.io/projected/160df6da-5e63-48cf-bf25-63821f86465a-kube-api-access-fcd4z\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:14.064405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:14.064363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:14.066932 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:14.066906 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/160df6da-5e63-48cf-bf25-63821f86465a-tls-certs\") pod \"model-serving-api-86f7b4b499-bq8rn\" (UID: \"160df6da-5e63-48cf-bf25-63821f86465a\") " pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:14.261600 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:14.261562 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:14.390981 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:14.390947 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bq8rn"] Apr 22 19:33:14.391941 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:33:14.391910 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160df6da_5e63_48cf_bf25_63821f86465a.slice/crio-f6af8c857b2ad0ac762427e970040a782a484d2e7843a7d006639b33627f83e0 WatchSource:0}: Error finding container f6af8c857b2ad0ac762427e970040a782a484d2e7843a7d006639b33627f83e0: Status 404 returned error can't find the container with id f6af8c857b2ad0ac762427e970040a782a484d2e7843a7d006639b33627f83e0 Apr 22 19:33:14.630159 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:14.630122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bq8rn" event={"ID":"160df6da-5e63-48cf-bf25-63821f86465a","Type":"ContainerStarted","Data":"f6af8c857b2ad0ac762427e970040a782a484d2e7843a7d006639b33627f83e0"} Apr 22 19:33:16.638072 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:16.638034 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bq8rn" event={"ID":"160df6da-5e63-48cf-bf25-63821f86465a","Type":"ContainerStarted","Data":"610f55f772c780373ae749fc79b76149465769e91c1af02d69ea990cdd61b100"} Apr 22 19:33:16.638546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:16.638125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:16.654460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:16.654406 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-bq8rn" podStartSLOduration=2.254571999 podStartE2EDuration="3.654391481s" podCreationTimestamp="2026-04-22 19:33:13 +0000 UTC" firstStartedPulling="2026-04-22 19:33:14.393826948 +0000 UTC m=+597.198668234" lastFinishedPulling="2026-04-22 19:33:15.79364643 +0000 UTC m=+598.598487716" observedRunningTime="2026-04-22 19:33:16.652449242 +0000 UTC m=+599.457290547" watchObservedRunningTime="2026-04-22 19:33:16.654391481 +0000 UTC m=+599.459232785" Apr 22 19:33:17.708643 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:17.708605 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:33:17.713411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:17.713387 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:33:27.645152 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:27.645123 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-bq8rn" Apr 22 19:33:49.105905 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.105865 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:33:49.108119 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.108102 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:33:49.110916 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.110894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qbzcz\"" Apr 22 19:33:49.119783 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.119757 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:33:49.176447 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.176417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq\" (UID: \"6a215e8e-0981-4f2f-bc93-4abe72b71b8f\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:33:49.277596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.277554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq\" (UID: \"6a215e8e-0981-4f2f-bc93-4abe72b71b8f\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:33:49.278013 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.277984 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq\" (UID: \"6a215e8e-0981-4f2f-bc93-4abe72b71b8f\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:33:49.419492 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.419390 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:33:49.541814 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.541788 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:33:49.544665 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:33:49.544635 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a215e8e_0981_4f2f_bc93_4abe72b71b8f.slice/crio-f05d6466545be779f4844441f624560b45c888e495576839b149691608986abb WatchSource:0}: Error finding container f05d6466545be779f4844441f624560b45c888e495576839b149691608986abb: Status 404 returned error can't find the container with id f05d6466545be779f4844441f624560b45c888e495576839b149691608986abb Apr 22 19:33:49.547026 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.547009 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:33:49.739658 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:49.739570 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerStarted","Data":"f05d6466545be779f4844441f624560b45c888e495576839b149691608986abb"} Apr 22 19:33:52.758850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:52.758809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerStarted","Data":"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096"} Apr 22 19:33:56.772430 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:56.772343 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerID="ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096" exitCode=0 Apr 22 19:33:56.772430 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:33:56.772394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerDied","Data":"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096"} Apr 22 19:34:10.826257 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:10.826221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerStarted","Data":"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92"} Apr 22 19:34:13.837985 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:13.837952 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerStarted","Data":"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1"} Apr 22 19:34:13.838437 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:13.838196 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:34:13.839802 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:13.839753 2579 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:13.864481 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:13.864421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podStartSLOduration=1.11221716 podStartE2EDuration="24.86440624s" podCreationTimestamp="2026-04-22 19:33:49 +0000 UTC" firstStartedPulling="2026-04-22 19:33:49.54716639 +0000 UTC m=+632.352007672" lastFinishedPulling="2026-04-22 19:34:13.299355465 +0000 UTC m=+656.104196752" observedRunningTime="2026-04-22 19:34:13.86263919 +0000 UTC m=+656.667480493" watchObservedRunningTime="2026-04-22 19:34:13.86440624 +0000 UTC m=+656.669247544" Apr 22 19:34:14.843274 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:14.843246 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:34:14.843705 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:14.843407 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:14.844459 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:14.844432 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:15.846716 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:15.846660 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:15.847133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:15.846999 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:25.847649 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:25.847593 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:25.848137 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:25.848112 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:35.847571 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:35.847520 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:35.848008 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:35.847973 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:45.847478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:45.847430 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:45.847990 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:45.847884 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:34:55.847444 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:55.847395 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:34:55.847900 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:34:55.847829 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:05.847700 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:05.847590 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:35:05.848133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:05.848073 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:15.847439 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:15.847385 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:35:15.848012 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:15.847983 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:25.847654 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:25.847622 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:35:25.848200 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:25.847967 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:35:34.146190 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.146151 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:35:34.146626 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.146481 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" containerID="cri-o://af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92" gracePeriod=30 Apr 22 19:35:34.146867 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.146784 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" containerID="cri-o://0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1" gracePeriod=30 Apr 22 19:35:34.280287 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.280250 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:35:34.282600 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.282580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:34.309266 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.309238 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:35:34.392974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.392940 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:35:34.395489 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.395468 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:34.420812 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.420718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf\" (UID: \"8ebb5400-0432-4003-89da-f25fa6c3abe1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:34.423119 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.423095 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:35:34.521689 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.521653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf\" (UID: \"8ebb5400-0432-4003-89da-f25fa6c3abe1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:34.521898 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.521753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs\" (UID: \"65e44826-b475-4bb3-b01c-33b2d1d76c7d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:34.522159 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.522135 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf\" (UID: \"8ebb5400-0432-4003-89da-f25fa6c3abe1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:34.592670 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.592630 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:34.622933 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.622895 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs\" (UID: \"65e44826-b475-4bb3-b01c-33b2d1d76c7d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:34.623265 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.623246 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs\" (UID: \"65e44826-b475-4bb3-b01c-33b2d1d76c7d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:34.705956 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.705924 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:34.721333 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.721308 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:35:34.724341 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:35:34.724311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ebb5400_0432_4003_89da_f25fa6c3abe1.slice/crio-50ac5a450acddc841a05565943f3335258ff72e311242b6ae555c452736412c7 WatchSource:0}: Error finding container 50ac5a450acddc841a05565943f3335258ff72e311242b6ae555c452736412c7: Status 404 returned error can't find the container with id 50ac5a450acddc841a05565943f3335258ff72e311242b6ae555c452736412c7 Apr 22 19:35:34.865687 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:34.865653 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:35:34.866057 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:35:34.866026 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e44826_b475_4bb3_b01c_33b2d1d76c7d.slice/crio-ddf893a527738aa5489d546d728d56979e22eeacb8be7944a0b0dbcaa858cc88 WatchSource:0}: Error finding container ddf893a527738aa5489d546d728d56979e22eeacb8be7944a0b0dbcaa858cc88: Status 404 returned error can't find the container with id ddf893a527738aa5489d546d728d56979e22eeacb8be7944a0b0dbcaa858cc88 Apr 22 19:35:35.095024 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.094979 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerStarted","Data":"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e"} Apr 22 19:35:35.095024 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.095027 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" 
event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerStarted","Data":"50ac5a450acddc841a05565943f3335258ff72e311242b6ae555c452736412c7"} Apr 22 19:35:35.096646 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.096614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerStarted","Data":"77b5b8af10ae8ad55a9541795bed224067ae2f8e9e914e8a0c5e534a656674af"} Apr 22 19:35:35.096646 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.096651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerStarted","Data":"ddf893a527738aa5489d546d728d56979e22eeacb8be7944a0b0dbcaa858cc88"} Apr 22 19:35:35.847379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.847329 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:35:35.847846 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:35.847674 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:39.110779 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.110746 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerID="af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92" exitCode=0 Apr 22 19:35:39.111235 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.110801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerDied","Data":"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92"} Apr 22 19:35:39.112136 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.112118 2579 generic.go:358] "Generic (PLEG): container finished" podID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerID="c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e" exitCode=0 Apr 22 19:35:39.112230 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.112176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerDied","Data":"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e"} Apr 22 19:35:39.113567 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.113541 2579 generic.go:358] "Generic (PLEG): container finished" podID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerID="77b5b8af10ae8ad55a9541795bed224067ae2f8e9e914e8a0c5e534a656674af" exitCode=0 Apr 22 19:35:39.113684 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:39.113567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" 
event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerDied","Data":"77b5b8af10ae8ad55a9541795bed224067ae2f8e9e914e8a0c5e534a656674af"} Apr 22 19:35:40.120838 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:40.120801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerStarted","Data":"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611"} Apr 22 19:35:40.121351 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:40.121151 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:35:40.122824 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:40.122776 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:35:40.139827 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:40.139754 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podStartSLOduration=6.139720126 podStartE2EDuration="6.139720126s" podCreationTimestamp="2026-04-22 19:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:35:40.137675798 +0000 UTC m=+742.942517123" watchObservedRunningTime="2026-04-22 19:35:40.139720126 +0000 UTC m=+742.944561430" Apr 22 19:35:41.126053 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:41.125999 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:35:45.847474 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:45.847382 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:35:45.847956 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:45.847766 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:51.127238 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:51.127181 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:35:55.846886 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:55.846834 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 19:35:55.847405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:55.846995 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:35:55.847405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:55.847221 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:35:55.847405 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:55.847334 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:35:59.192376 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:59.192341 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerStarted","Data":"1623532e92447d2f01e4bb2c16d589daaa50e2748e0cac1f5d04f9bf8f25d994"} Apr 22 19:35:59.192799 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:59.192620 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:35:59.193995 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:59.193970 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:35:59.212317 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:35:59.212266 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podStartSLOduration=5.711984542 podStartE2EDuration="25.21225337s" podCreationTimestamp="2026-04-22 19:35:34 +0000 UTC" firstStartedPulling="2026-04-22 19:35:39.114932211 +0000 UTC m=+741.919773494" lastFinishedPulling="2026-04-22 19:35:58.615201039 +0000 UTC m=+761.420042322" observedRunningTime="2026-04-22 19:35:59.209903011 +0000 UTC m=+762.014744341" watchObservedRunningTime="2026-04-22 19:35:59.21225337 +0000 UTC m=+762.017094674" Apr 22 19:36:00.195974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:00.195936 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:36:01.126588 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:01.126541 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:36:04.809350 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:04.809325 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:36:04.903326 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:04.903290 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location\") pod \"6a215e8e-0981-4f2f-bc93-4abe72b71b8f\" (UID: \"6a215e8e-0981-4f2f-bc93-4abe72b71b8f\") " Apr 22 19:36:04.903648 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:04.903621 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a215e8e-0981-4f2f-bc93-4abe72b71b8f" (UID: "6a215e8e-0981-4f2f-bc93-4abe72b71b8f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:36:05.004969 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.004929 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a215e8e-0981-4f2f-bc93-4abe72b71b8f-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:36:05.212901 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.212865 2579 generic.go:358] "Generic (PLEG): container finished" podID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerID="0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1" exitCode=0 Apr 22 19:36:05.213073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.212924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerDied","Data":"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1"} Apr 22 19:36:05.213073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.212952 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" Apr 22 19:36:05.213073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.212959 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq" event={"ID":"6a215e8e-0981-4f2f-bc93-4abe72b71b8f","Type":"ContainerDied","Data":"f05d6466545be779f4844441f624560b45c888e495576839b149691608986abb"} Apr 22 19:36:05.213073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.212980 2579 scope.go:117] "RemoveContainer" containerID="0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1" Apr 22 19:36:05.221572 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.221549 2579 scope.go:117] "RemoveContainer" containerID="af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92" Apr 22 19:36:05.230539 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.230516 2579 scope.go:117] "RemoveContainer" containerID="ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096" Apr 22 19:36:05.236866 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.236842 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:36:05.239073 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.239047 2579 scope.go:117] "RemoveContainer" containerID="0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1" Apr 22 19:36:05.239386 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:36:05.239365 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1\": container with ID starting with 0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1 not found: ID does not exist" containerID="0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1" Apr 22 19:36:05.239472 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.239394 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1"} err="failed to get container status \"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1\": rpc error: code = NotFound desc = could not find container \"0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1\": container with ID starting with 0cab98a80987efc31388db9f4fecb84060ba47fbbccb03d4078055bb32c1d3e1 not found: ID does not exist" Apr 22 19:36:05.239472 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.239414 2579 scope.go:117] "RemoveContainer" containerID="af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92" Apr 22 19:36:05.239698 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:36:05.239669 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92\": container with ID starting with af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92 not found: ID does not exist" containerID="af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92" Apr 22 19:36:05.239826 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.239707 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92"} err="failed to get container status 
\"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92\": rpc error: code = NotFound desc = could not find container \"af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92\": container with ID starting with af9bc7b3387a948ce0886146ab185512639e33062dcbb8db9e7261f5febe4a92 not found: ID does not exist" Apr 22 19:36:05.239826 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.239754 2579 scope.go:117] "RemoveContainer" containerID="ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096" Apr 22 19:36:05.240029 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:36:05.240012 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096\": container with ID starting with ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096 not found: ID does not exist" containerID="ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096" Apr 22 19:36:05.240087 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.240046 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096"} err="failed to get container status \"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096\": rpc error: code = NotFound desc = could not find container \"ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096\": container with ID starting with ac40004501eca68c399f11efd95db198fbff751aecdbc56005249f0c54c49096 not found: ID does not exist" Apr 22 19:36:05.240692 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.240671 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5fa8b-predictor-6b99545d86-xj7nq"] Apr 22 19:36:05.805562 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:05.805522 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" path="/var/lib/kubelet/pods/6a215e8e-0981-4f2f-bc93-4abe72b71b8f/volumes" Apr 22 19:36:10.196584 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:10.196529 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:36:11.126970 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:11.126910 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:36:20.196353 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:20.196304 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:36:21.127079 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:21.127033 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:36:30.196444 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:30.196389 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:36:31.126345 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:31.126299 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:36:40.196448 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:40.196350 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:36:41.126926 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:41.126877 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:36:48.802896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:48.802864 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:36:50.196109 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:36:50.196059 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 22 19:37:00.197872 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:00.197830 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:37:14.477879 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.477844 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:37:14.478831 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.478250 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" containerID="cri-o://2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611" gracePeriod=30 Apr 22 19:37:14.537841 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.537809 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:37:14.538197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538177 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" Apr 22 19:37:14.538197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538192 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538208 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="storage-initializer" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538213 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="storage-initializer" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538222 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538228 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538279 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="kserve-container" Apr 22 19:37:14.538390 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.538291 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a215e8e-0981-4f2f-bc93-4abe72b71b8f" containerName="agent" Apr 22 19:37:14.540364 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.540345 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:14.552954 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.552925 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:37:14.599815 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.599778 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:37:14.602117 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.602095 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:14.619848 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.619818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:37:14.656689 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.656661 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:37:14.656947 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.656926 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" containerID="cri-o://1623532e92447d2f01e4bb2c16d589daaa50e2748e0cac1f5d04f9bf8f25d994" gracePeriod=30 Apr 22 19:37:14.682799 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.682757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc\" (UID: \"f536e031-82ee-46e3-8ca7-d53ad0285015\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:14.682942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.682816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp\" (UID: \"b831095f-5903-4cd1-9637-21c2e4549f25\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:14.784060 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.783973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc\" (UID: \"f536e031-82ee-46e3-8ca7-d53ad0285015\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:14.784060 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.784020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp\" (UID: \"b831095f-5903-4cd1-9637-21c2e4549f25\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:14.784470 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.784447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp\" (UID: \"b831095f-5903-4cd1-9637-21c2e4549f25\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:14.784510 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:37:14.784447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc\" (UID: \"f536e031-82ee-46e3-8ca7-d53ad0285015\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:14.852338 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.852303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:14.911370 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.911336 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:14.985554 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:14.985525 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:37:14.988936 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:37:14.988906 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf536e031_82ee_46e3_8ca7_d53ad0285015.slice/crio-7ac6ec25dac0d0f778166d0caf440ca7ef1e6ae60efbde590ca08043f9b8eca8 WatchSource:0}: Error finding container 7ac6ec25dac0d0f778166d0caf440ca7ef1e6ae60efbde590ca08043f9b8eca8: Status 404 returned error can't find the container with id 7ac6ec25dac0d0f778166d0caf440ca7ef1e6ae60efbde590ca08043f9b8eca8 Apr 22 19:37:15.052768 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:15.052716 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:37:15.055751 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:37:15.055710 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb831095f_5903_4cd1_9637_21c2e4549f25.slice/crio-1f871fbbbb59f4d82b4b99c6e70c4230de36a7b65735a3875590f38476181bdc WatchSource:0}: Error finding container 1f871fbbbb59f4d82b4b99c6e70c4230de36a7b65735a3875590f38476181bdc: Status 404 returned error can't find the container with id 1f871fbbbb59f4d82b4b99c6e70c4230de36a7b65735a3875590f38476181bdc Apr 22 19:37:15.437255 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:15.437214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerStarted","Data":"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e"} Apr 22 19:37:15.437473 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:15.437264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerStarted","Data":"7ac6ec25dac0d0f778166d0caf440ca7ef1e6ae60efbde590ca08043f9b8eca8"} Apr 22 19:37:15.438772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:15.438719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" 
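Each "SyncLoop (PLEG): event for pod" entry carries a small JSON payload: ID is the pod UID, Type the lifecycle transition (ContainerStarted, ContainerDied), and Data the container or sandbox ID. A sketch of decoding one of the payloads logged above; the plegEvent struct is defined here only for this example:

package main

import (
	"encoding/json"
	"fmt"
)

// plegEvent mirrors the shape of the event={...} value in the log.
type plegEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	raw := `{"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerStarted","Data":"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}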
event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerStarted","Data":"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90"} Apr 22 19:37:15.438772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:15.438774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerStarted","Data":"1f871fbbbb59f4d82b4b99c6e70c4230de36a7b65735a3875590f38476181bdc"} Apr 22 19:37:18.452290 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.452255 2579 generic.go:358] "Generic (PLEG): container finished" podID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerID="1623532e92447d2f01e4bb2c16d589daaa50e2748e0cac1f5d04f9bf8f25d994" exitCode=0 Apr 22 19:37:18.452762 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.452323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerDied","Data":"1623532e92447d2f01e4bb2c16d589daaa50e2748e0cac1f5d04f9bf8f25d994"} Apr 22 19:37:18.501274 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.501249 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:37:18.616141 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.616097 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location\") pod \"65e44826-b475-4bb3-b01c-33b2d1d76c7d\" (UID: \"65e44826-b475-4bb3-b01c-33b2d1d76c7d\") " Apr 22 19:37:18.616450 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.616424 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65e44826-b475-4bb3-b01c-33b2d1d76c7d" (UID: "65e44826-b475-4bb3-b01c-33b2d1d76c7d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:37:18.717389 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.717346 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65e44826-b475-4bb3-b01c-33b2d1d76c7d-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:37:18.802742 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:18.802695 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 22 19:37:19.002597 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.002569 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:37:19.120360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.120319 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location\") pod \"8ebb5400-0432-4003-89da-f25fa6c3abe1\" (UID: \"8ebb5400-0432-4003-89da-f25fa6c3abe1\") " Apr 22 19:37:19.120701 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.120673 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ebb5400-0432-4003-89da-f25fa6c3abe1" (UID: "8ebb5400-0432-4003-89da-f25fa6c3abe1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:37:19.221214 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.221126 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ebb5400-0432-4003-89da-f25fa6c3abe1-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:37:19.457348 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.457313 2579 generic.go:358] "Generic (PLEG): container finished" podID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerID="2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611" exitCode=0 Apr 22 19:37:19.457772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.457383 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" Apr 22 19:37:19.457772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.457405 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerDied","Data":"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611"} Apr 22 19:37:19.457772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.457450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf" event={"ID":"8ebb5400-0432-4003-89da-f25fa6c3abe1","Type":"ContainerDied","Data":"50ac5a450acddc841a05565943f3335258ff72e311242b6ae555c452736412c7"} Apr 22 19:37:19.457772 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.457469 2579 scope.go:117] "RemoveContainer" containerID="2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611" Apr 22 19:37:19.459109 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.459089 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" Apr 22 19:37:19.459109 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.459089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs" event={"ID":"65e44826-b475-4bb3-b01c-33b2d1d76c7d","Type":"ContainerDied","Data":"ddf893a527738aa5489d546d728d56979e22eeacb8be7944a0b0dbcaa858cc88"} Apr 22 19:37:19.460622 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.460594 2579 generic.go:358] "Generic (PLEG): container finished" podID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerID="a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e" exitCode=0 Apr 22 19:37:19.460751 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.460663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerDied","Data":"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e"} Apr 22 19:37:19.462273 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.462250 2579 generic.go:358] "Generic (PLEG): container finished" podID="b831095f-5903-4cd1-9637-21c2e4549f25" containerID="a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90" exitCode=0 Apr 22 19:37:19.462368 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.462301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerDied","Data":"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90"} Apr 22 19:37:19.467549 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.467537 2579 scope.go:117] "RemoveContainer" containerID="c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e" Apr 22 19:37:19.475681 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.475656 2579 scope.go:117] "RemoveContainer" containerID="2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611" Apr 22 19:37:19.476100 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:37:19.476081 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611\": container with ID starting with 2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611 not found: ID does not exist" containerID="2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611" Apr 22 19:37:19.476169 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.476110 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611"} err="failed to get container status \"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611\": rpc error: code = NotFound desc = could not find container \"2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611\": container with ID starting with 2171f008fc4ca4dd54c09f7c08f18138c0b2bc9fad0ef1e862c9aeb275ee4611 not found: ID does not exist" Apr 22 19:37:19.476169 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.476128 2579 scope.go:117] "RemoveContainer" containerID="c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e" Apr 22 19:37:19.476362 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:37:19.476339 2579 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e\": container with ID starting with c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e not found: ID does not exist" containerID="c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e" Apr 22 19:37:19.476460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.476367 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e"} err="failed to get container status \"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e\": rpc error: code = NotFound desc = could not find container \"c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e\": container with ID starting with c1789fdc81c61e1efc0b9400fcaa85aeb30abdae6a7fd02680feff4a974db61e not found: ID does not exist" Apr 22 19:37:19.476460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.476383 2579 scope.go:117] "RemoveContainer" containerID="1623532e92447d2f01e4bb2c16d589daaa50e2748e0cac1f5d04f9bf8f25d994" Apr 22 19:37:19.497934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.497910 2579 scope.go:117] "RemoveContainer" containerID="77b5b8af10ae8ad55a9541795bed224067ae2f8e9e914e8a0c5e534a656674af" Apr 22 19:37:19.518664 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.518637 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:37:19.522799 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.522774 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-8e08c-predictor-5985d486cb-722tf"] Apr 22 19:37:19.534864 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.534839 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:37:19.541103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.541078 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-8e08c-predictor-7b9b79c65f-ssshs"] Apr 22 19:37:19.805133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.805047 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" path="/var/lib/kubelet/pods/65e44826-b475-4bb3-b01c-33b2d1d76c7d/volumes" Apr 22 19:37:19.805409 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:19.805392 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" path="/var/lib/kubelet/pods/8ebb5400-0432-4003-89da-f25fa6c3abe1/volumes" Apr 22 19:37:20.467798 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.467761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerStarted","Data":"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c"} Apr 22 19:37:20.468255 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.468160 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:37:20.469355 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.469321 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:37:20.470949 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.470928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerStarted","Data":"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35"} Apr 22 19:37:20.471206 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.471190 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:37:20.472317 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.472298 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:37:20.485187 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.485137 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podStartSLOduration=6.485122752 podStartE2EDuration="6.485122752s" podCreationTimestamp="2026-04-22 19:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:20.484447475 +0000 UTC m=+843.289288782" watchObservedRunningTime="2026-04-22 19:37:20.485122752 +0000 UTC m=+843.289964056" Apr 22 19:37:20.501014 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:20.500965 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podStartSLOduration=6.500952031 podStartE2EDuration="6.500952031s" podCreationTimestamp="2026-04-22 19:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:20.499717922 +0000 UTC m=+843.304559247" watchObservedRunningTime="2026-04-22 19:37:20.500952031 +0000 UTC m=+843.305793336" Apr 22 19:37:21.474354 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:21.474312 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:37:21.474751 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:21.474324 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:37:31.474843 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:31.474790 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" 
podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:37:31.475251 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:31.474798 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:37:41.475272 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:41.475228 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:37:41.475647 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:41.475220 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:37:51.475139 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:51.475098 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:37:51.475612 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:37:51.475092 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:38:01.474967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:01.474912 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:38:01.475397 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:01.474912 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:38:11.474846 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:11.474722 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:38:11.475292 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:11.474722 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 22 19:38:17.736183 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:17.736152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:38:17.737685 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:17.737664 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:38:21.474664 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:21.474620 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:38:21.475806 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:21.475779 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:38:24.801360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:24.801318 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 22 19:38:34.802939 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:38:34.802902 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:39:04.785541 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.785508 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:39:04.786933 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.785850 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" containerID="cri-o://236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35" gracePeriod=30 Apr 22 19:39:04.822508 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822474 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:39:04.822934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822917 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" Apr 22 19:39:04.822934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822935 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822947 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822953 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822969 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="storage-initializer" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822976 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="storage-initializer" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.822996 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="storage-initializer" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.823002 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="storage-initializer" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.823058 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="65e44826-b475-4bb3-b01c-33b2d1d76c7d" containerName="kserve-container" Apr 22 19:39:04.823120 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.823070 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ebb5400-0432-4003-89da-f25fa6c3abe1" containerName="kserve-container" Apr 22 19:39:04.824836 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.824820 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:39:04.834716 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.834690 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:39:04.835293 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.835278 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:39:04.910897 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.910098 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:39:04.910897 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.910465 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" containerID="cri-o://25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c" gracePeriod=30 Apr 22 19:39:04.978366 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.978341 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:39:04.981112 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:39:04.981075 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfecab682_dbf8_473b_b88e_8227390b7c39.slice/crio-71f9dc7e85fa9fa7220961bc5be3ca5837802e5a002f33778481fa9e4d0cdef9 WatchSource:0}: Error finding container 71f9dc7e85fa9fa7220961bc5be3ca5837802e5a002f33778481fa9e4d0cdef9: Status 404 returned error can't find the container with id 71f9dc7e85fa9fa7220961bc5be3ca5837802e5a002f33778481fa9e4d0cdef9 Apr 22 19:39:04.982862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:04.982845 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:39:05.823695 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:05.823657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" event={"ID":"fecab682-dbf8-473b-b88e-8227390b7c39","Type":"ContainerStarted","Data":"71f9dc7e85fa9fa7220961bc5be3ca5837802e5a002f33778481fa9e4d0cdef9"} Apr 22 19:39:06.828864 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:06.828824 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" event={"ID":"fecab682-dbf8-473b-b88e-8227390b7c39","Type":"ContainerStarted","Data":"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04"} Apr 22 19:39:06.829285 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:06.829063 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:39:06.831038 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:06.831017 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:39:06.844640 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:06.844592 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" podStartSLOduration=1.8392822880000002 podStartE2EDuration="2.844579548s" podCreationTimestamp="2026-04-22 19:39:04 +0000 UTC" firstStartedPulling="2026-04-22 19:39:04.982975085 +0000 UTC m=+947.787816367" lastFinishedPulling="2026-04-22 19:39:05.988272344 +0000 UTC m=+948.793113627" observedRunningTime="2026-04-22 19:39:06.844377416 +0000 UTC m=+949.649218725" watchObservedRunningTime="2026-04-22 19:39:06.844579548 
+0000 UTC m=+949.649420934" Apr 22 19:39:08.755960 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.755937 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:39:08.837206 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.837119 2579 generic.go:358] "Generic (PLEG): container finished" podID="b831095f-5903-4cd1-9637-21c2e4549f25" containerID="25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c" exitCode=0 Apr 22 19:39:08.837206 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.837185 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" Apr 22 19:39:08.837379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.837206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerDied","Data":"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c"} Apr 22 19:39:08.837379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.837249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp" event={"ID":"b831095f-5903-4cd1-9637-21c2e4549f25","Type":"ContainerDied","Data":"1f871fbbbb59f4d82b4b99c6e70c4230de36a7b65735a3875590f38476181bdc"} Apr 22 19:39:08.837379 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.837267 2579 scope.go:117] "RemoveContainer" containerID="25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c" Apr 22 19:39:08.845722 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.845701 2579 scope.go:117] "RemoveContainer" containerID="a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90" Apr 22 19:39:08.852956 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.852929 2579 scope.go:117] "RemoveContainer" containerID="25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c" Apr 22 19:39:08.853210 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:39:08.853193 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c\": container with ID starting with 25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c not found: ID does not exist" containerID="25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c" Apr 22 19:39:08.853259 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.853218 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c"} err="failed to get container status \"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c\": rpc error: code = NotFound desc = could not find container \"25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c\": container with ID starting with 25a5070274c00c11dd3b584919885892e7ba93bedaac10fc378f79762d002f8c not found: ID does not exist" Apr 22 19:39:08.853259 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.853237 2579 scope.go:117] "RemoveContainer" containerID="a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90" Apr 22 19:39:08.853481 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:39:08.853456 2579 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90\": container with ID starting with a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90 not found: ID does not exist" containerID="a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90" Apr 22 19:39:08.853522 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.853488 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90"} err="failed to get container status \"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90\": rpc error: code = NotFound desc = could not find container \"a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90\": container with ID starting with a271a65d0a4c2dbd42eaf17541a971cef67dd7dbd4f637730d1abbb50f9bcb90 not found: ID does not exist" Apr 22 19:39:08.865483 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.865458 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location\") pod \"b831095f-5903-4cd1-9637-21c2e4549f25\" (UID: \"b831095f-5903-4cd1-9637-21c2e4549f25\") " Apr 22 19:39:08.865779 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.865752 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b831095f-5903-4cd1-9637-21c2e4549f25" (UID: "b831095f-5903-4cd1-9637-21c2e4549f25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:08.967045 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:08.966990 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b831095f-5903-4cd1-9637-21c2e4549f25-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.160866 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.160824 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:39:09.164113 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.164083 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ed925-predictor-55955498c5-sfmbp"] Apr 22 19:39:09.426937 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.426915 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:39:09.572655 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.572611 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location\") pod \"f536e031-82ee-46e3-8ca7-d53ad0285015\" (UID: \"f536e031-82ee-46e3-8ca7-d53ad0285015\") " Apr 22 19:39:09.572998 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.572973 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f536e031-82ee-46e3-8ca7-d53ad0285015" (UID: "f536e031-82ee-46e3-8ca7-d53ad0285015"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:09.673366 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.673271 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f536e031-82ee-46e3-8ca7-d53ad0285015-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:39:09.805462 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.805424 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" path="/var/lib/kubelet/pods/b831095f-5903-4cd1-9637-21c2e4549f25/volumes" Apr 22 19:39:09.843533 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.843498 2579 generic.go:358] "Generic (PLEG): container finished" podID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerID="236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35" exitCode=0 Apr 22 19:39:09.843709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.843582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerDied","Data":"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35"} Apr 22 19:39:09.843709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.843610 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" Apr 22 19:39:09.843709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.843628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc" event={"ID":"f536e031-82ee-46e3-8ca7-d53ad0285015","Type":"ContainerDied","Data":"7ac6ec25dac0d0f778166d0caf440ca7ef1e6ae60efbde590ca08043f9b8eca8"} Apr 22 19:39:09.843709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.843650 2579 scope.go:117] "RemoveContainer" containerID="236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35" Apr 22 19:39:09.851523 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.851503 2579 scope.go:117] "RemoveContainer" containerID="a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e" Apr 22 19:39:09.859318 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.859290 2579 scope.go:117] "RemoveContainer" containerID="236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35" Apr 22 19:39:09.859645 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:39:09.859618 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35\": container with ID starting with 236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35 not found: ID does not exist" containerID="236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35" Apr 22 19:39:09.859722 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.859655 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35"} err="failed to get container status \"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35\": rpc error: code = NotFound desc = could not find container \"236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35\": container with ID starting with 236a4b2db3a9cdb8c1ded9e3fd6046909b9655b8091472ea654d65cea0289a35 not found: ID does not exist" Apr 22 19:39:09.859722 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.859676 2579 scope.go:117] "RemoveContainer" containerID="a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e" Apr 22 19:39:09.859973 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:39:09.859950 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e\": container with ID starting with a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e not found: ID does not exist" containerID="a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e" Apr 22 19:39:09.860023 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.859983 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e"} err="failed to get container status \"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e\": rpc error: code = NotFound desc = could not find container \"a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e\": container with ID starting with a51d5671e9a2e370039ba13ade1418f91dce9bc8723219c7dc98435713a7cb2e not found: ID does not exist" Apr 22 19:39:09.860639 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.860622 2579 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:39:09.864643 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:09.864619 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ed925-predictor-7887d85c84-772mc"] Apr 22 19:39:11.805879 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:11.805847 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" path="/var/lib/kubelet/pods/f536e031-82ee-46e3-8ca7-d53ad0285015/volumes" Apr 22 19:39:14.865896 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.865863 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866231 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="storage-initializer" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866242 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="storage-initializer" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866255 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866261 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866273 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="storage-initializer" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866279 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="storage-initializer" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866285 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" Apr 22 19:39:14.866298 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866290 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" Apr 22 19:39:14.866549 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866338 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f536e031-82ee-46e3-8ca7-d53ad0285015" containerName="kserve-container" Apr 22 19:39:14.866549 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.866348 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b831095f-5903-4cd1-9637-21c2e4549f25" containerName="kserve-container" Apr 22 19:39:14.871003 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.870985 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:14.877402 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:14.877377 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:39:15.018566 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.018513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location\") pod \"isvc-logger-raw-38710-predictor-655997d55d-tm5ns\" (UID: \"2bf0aa43-6bf9-49fc-9734-167af6357e21\") " pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:15.119510 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.119418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location\") pod \"isvc-logger-raw-38710-predictor-655997d55d-tm5ns\" (UID: \"2bf0aa43-6bf9-49fc-9734-167af6357e21\") " pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:15.119827 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.119806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location\") pod \"isvc-logger-raw-38710-predictor-655997d55d-tm5ns\" (UID: \"2bf0aa43-6bf9-49fc-9734-167af6357e21\") " pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:15.182268 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.182228 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:15.314593 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.314490 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:39:15.317227 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:39:15.317196 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf0aa43_6bf9_49fc_9734_167af6357e21.slice/crio-87f377b86d7e0f08bccc575071dd2be27a5ab118608aeff99e7334a38a0a087b WatchSource:0}: Error finding container 87f377b86d7e0f08bccc575071dd2be27a5ab118608aeff99e7334a38a0a087b: Status 404 returned error can't find the container with id 87f377b86d7e0f08bccc575071dd2be27a5ab118608aeff99e7334a38a0a087b Apr 22 19:39:15.869478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.869442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerStarted","Data":"c43298da33070d4d3dcbddb273259875f47b34bda1a39472fffe321b94f69220"} Apr 22 19:39:15.869478 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:15.869482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerStarted","Data":"87f377b86d7e0f08bccc575071dd2be27a5ab118608aeff99e7334a38a0a087b"} Apr 22 19:39:19.883229 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:19.883196 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerID="c43298da33070d4d3dcbddb273259875f47b34bda1a39472fffe321b94f69220" exitCode=0 Apr 22 19:39:19.883619 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:19.883271 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerDied","Data":"c43298da33070d4d3dcbddb273259875f47b34bda1a39472fffe321b94f69220"} Apr 22 19:39:20.887950 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.887912 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerStarted","Data":"368dfaef8fbb6d79a4df3f1f62accab5276d1b81ed5db934a11f588d610eec02"} Apr 22 19:39:20.888352 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.887955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerStarted","Data":"9d7a5402258d9992a90f08a2378d19a9957f54f67ae7c8904445ae643bfeea32"} Apr 22 19:39:20.888352 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.888272 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:20.888352 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.888306 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:39:20.889653 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.889620 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:39:20.890424 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.890399 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:20.907646 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:20.907595 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podStartSLOduration=6.907583847 podStartE2EDuration="6.907583847s" podCreationTimestamp="2026-04-22 19:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:20.905511466 +0000 UTC m=+963.710352769" watchObservedRunningTime="2026-04-22 19:39:20.907583847 +0000 UTC m=+963.712425151" Apr 22 19:39:21.890919 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:21.890868 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:39:21.891309 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:21.891230 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:31.891256 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:31.891207 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:39:31.891685 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:31.891621 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:41.891258 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:41.891202 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:39:41.891811 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:41.891667 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:39:51.891197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:51.891153 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:39:51.891628 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:39:51.891552 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:01.891265 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:01.891213 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:40:01.891710 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:01.891621 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:11.891519 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:11.891463 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:40:11.892031 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:11.892001 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:21.891671 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:21.891610 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:40:21.892129 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:21.892091 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:31.891371 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:31.891340 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:40:31.891798 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:31.891532 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:40:39.900351 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:39.900322 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-38710-predictor-7d9b68b89-9f6r8_fecab682-dbf8-473b-b88e-8227390b7c39/kserve-container/0.log" Apr 22 19:40:40.091453 ip-10-0-133-159 
kubenswrapper[2579]: I0422 19:40:40.091414 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:40:40.091936 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.091897 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" containerID="cri-o://9d7a5402258d9992a90f08a2378d19a9957f54f67ae7c8904445ae643bfeea32" gracePeriod=30 Apr 22 19:40:40.092088 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.092061 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" containerID="cri-o://368dfaef8fbb6d79a4df3f1f62accab5276d1b81ed5db934a11f588d610eec02" gracePeriod=30 Apr 22 19:40:40.122452 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.122418 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"] Apr 22 19:40:40.125738 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.125706 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:40.137011 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.136987 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"] Apr 22 19:40:40.159544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.159456 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5\" (UID: \"456e4986-41dd-4d49-9586-74a584b69381\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:40.206606 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.206571 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:40:40.206938 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.206909 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" podUID="fecab682-dbf8-473b-b88e-8227390b7c39" containerName="kserve-container" containerID="cri-o://d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04" gracePeriod=30 Apr 22 19:40:40.259901 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.259863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5\" (UID: \"456e4986-41dd-4d49-9586-74a584b69381\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:40.260229 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.260204 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5\" (UID: \"456e4986-41dd-4d49-9586-74a584b69381\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:40.436544 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.436516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:40.460434 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.460404 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:40:40.561150 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:40.561090 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"] Apr 22 19:40:40.563557 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:40:40.563526 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod456e4986_41dd_4d49_9586_74a584b69381.slice/crio-6be47154043ea4ab65a78efd6b53e626e2312125128bf86994cb0c3bc941e8d9 WatchSource:0}: Error finding container 6be47154043ea4ab65a78efd6b53e626e2312125128bf86994cb0c3bc941e8d9: Status 404 returned error can't find the container with id 6be47154043ea4ab65a78efd6b53e626e2312125128bf86994cb0c3bc941e8d9 Apr 22 19:40:41.147750 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.147693 2579 generic.go:358] "Generic (PLEG): container finished" podID="fecab682-dbf8-473b-b88e-8227390b7c39" containerID="d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04" exitCode=2 Apr 22 19:40:41.148193 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.147773 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" Apr 22 19:40:41.148193 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.147784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" event={"ID":"fecab682-dbf8-473b-b88e-8227390b7c39","Type":"ContainerDied","Data":"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04"} Apr 22 19:40:41.148193 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.147829 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8" event={"ID":"fecab682-dbf8-473b-b88e-8227390b7c39","Type":"ContainerDied","Data":"71f9dc7e85fa9fa7220961bc5be3ca5837802e5a002f33778481fa9e4d0cdef9"} Apr 22 19:40:41.148193 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.147851 2579 scope.go:117] "RemoveContainer" containerID="d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04" Apr 22 19:40:41.149464 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.149431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerStarted","Data":"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"} Apr 22 19:40:41.149591 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.149484 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerStarted","Data":"6be47154043ea4ab65a78efd6b53e626e2312125128bf86994cb0c3bc941e8d9"} Apr 22 19:40:41.157825 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.157809 2579 scope.go:117] "RemoveContainer" containerID="d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04" Apr 22 19:40:41.158116 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:40:41.158096 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04\": container with ID starting with d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04 not found: ID does not exist" containerID="d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04" Apr 22 19:40:41.158190 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.158127 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04"} err="failed to get container status \"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04\": rpc error: code = NotFound desc = could not find container \"d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04\": container with ID starting with d1bad8663b207f75269c04e4075b534a93d0bf4f7df6343b25c27083bc74fd04 not found: ID does not exist" Apr 22 19:40:41.177986 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.177951 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:40:41.181237 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.181208 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-38710-predictor-7d9b68b89-9f6r8"] Apr 22 19:40:41.806252 ip-10-0-133-159 kubenswrapper[2579]: I0422 
19:40:41.806217 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecab682-dbf8-473b-b88e-8227390b7c39" path="/var/lib/kubelet/pods/fecab682-dbf8-473b-b88e-8227390b7c39/volumes" Apr 22 19:40:41.890804 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.890756 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:40:41.891143 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:41.891113 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:45.166988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:45.166954 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerID="9d7a5402258d9992a90f08a2378d19a9957f54f67ae7c8904445ae643bfeea32" exitCode=0 Apr 22 19:40:45.167432 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:45.167029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerDied","Data":"9d7a5402258d9992a90f08a2378d19a9957f54f67ae7c8904445ae643bfeea32"} Apr 22 19:40:45.168339 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:45.168319 2579 generic.go:358] "Generic (PLEG): container finished" podID="456e4986-41dd-4d49-9586-74a584b69381" containerID="e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b" exitCode=0 Apr 22 19:40:45.168455 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:45.168371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerDied","Data":"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"} Apr 22 19:40:46.173709 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:46.173673 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerStarted","Data":"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"} Apr 22 19:40:46.174174 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:46.173989 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:40:46.175438 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:46.175409 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:40:46.193196 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:46.193138 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podStartSLOduration=6.193124425 podStartE2EDuration="6.193124425s" podCreationTimestamp="2026-04-22 19:40:40 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:40:46.191152512 +0000 UTC m=+1048.995993829" watchObservedRunningTime="2026-04-22 19:40:46.193124425 +0000 UTC m=+1048.997965729" Apr 22 19:40:47.178332 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:47.178283 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:40:51.891501 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:51.891449 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:40:51.891979 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:51.891794 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:40:57.178638 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:40:57.178588 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:01.891261 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:01.891211 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 22 19:41:01.891793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:01.891364 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:41:01.891793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:01.891563 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:41:01.891793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:01.891782 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:41:07.179213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:07.179166 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:10.258597 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.258564 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bf0aa43-6bf9-49fc-9734-167af6357e21" 
containerID="368dfaef8fbb6d79a4df3f1f62accab5276d1b81ed5db934a11f588d610eec02" exitCode=137 Apr 22 19:41:10.258971 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.258621 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerDied","Data":"368dfaef8fbb6d79a4df3f1f62accab5276d1b81ed5db934a11f588d610eec02"} Apr 22 19:41:10.274611 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.274586 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:41:10.428055 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.428012 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location\") pod \"2bf0aa43-6bf9-49fc-9734-167af6357e21\" (UID: \"2bf0aa43-6bf9-49fc-9734-167af6357e21\") " Apr 22 19:41:10.428236 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.428200 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2bf0aa43-6bf9-49fc-9734-167af6357e21" (UID: "2bf0aa43-6bf9-49fc-9734-167af6357e21"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:41:10.428330 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:10.428317 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bf0aa43-6bf9-49fc-9734-167af6357e21-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:41:11.263534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.263501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" event={"ID":"2bf0aa43-6bf9-49fc-9734-167af6357e21","Type":"ContainerDied","Data":"87f377b86d7e0f08bccc575071dd2be27a5ab118608aeff99e7334a38a0a087b"} Apr 22 19:41:11.263534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.263520 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns" Apr 22 19:41:11.263534 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.263545 2579 scope.go:117] "RemoveContainer" containerID="368dfaef8fbb6d79a4df3f1f62accab5276d1b81ed5db934a11f588d610eec02" Apr 22 19:41:11.271824 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.271649 2579 scope.go:117] "RemoveContainer" containerID="9d7a5402258d9992a90f08a2378d19a9957f54f67ae7c8904445ae643bfeea32" Apr 22 19:41:11.279276 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.279258 2579 scope.go:117] "RemoveContainer" containerID="c43298da33070d4d3dcbddb273259875f47b34bda1a39472fffe321b94f69220" Apr 22 19:41:11.287166 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.287141 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:41:11.291591 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.291564 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-38710-predictor-655997d55d-tm5ns"] Apr 22 19:41:11.805061 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:11.805028 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" path="/var/lib/kubelet/pods/2bf0aa43-6bf9-49fc-9734-167af6357e21/volumes" Apr 22 19:41:17.179267 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:17.179215 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:27.178935 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:27.178888 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:37.178340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:37.178289 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:47.179191 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:47.179137 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:41:57.178892 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:41:57.178840 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:07.178934 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:07.178884 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" 
podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:13.801297 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:13.801250 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:23.801788 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:23.801715 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:33.801746 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:33.801631 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:43.801996 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:43.801939 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:42:53.801512 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:42:53.801459 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:43:03.806961 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:03.806928 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:43:10.299811 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.299769 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"] Apr 22 19:43:10.300271 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.300190 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" containerID="cri-o://ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961" gracePeriod=30 Apr 22 19:43:10.390046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390011 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"] Apr 22 19:43:10.390366 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390354 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" Apr 22 19:43:10.390411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390368 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" Apr 22 19:43:10.390411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390390 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fecab682-dbf8-473b-b88e-8227390b7c39" containerName="kserve-container" Apr 22 19:43:10.390411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390396 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecab682-dbf8-473b-b88e-8227390b7c39" containerName="kserve-container" Apr 22 19:43:10.390411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390405 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="storage-initializer" Apr 22 19:43:10.390411 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390410 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="storage-initializer" Apr 22 19:43:10.390559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390417 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" Apr 22 19:43:10.390559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390422 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" Apr 22 19:43:10.390559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390480 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="agent" Apr 22 19:43:10.390559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390487 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bf0aa43-6bf9-49fc-9734-167af6357e21" containerName="kserve-container" Apr 22 19:43:10.390559 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.390494 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fecab682-dbf8-473b-b88e-8227390b7c39" containerName="kserve-container" Apr 22 19:43:10.393524 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.393501 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:10.399474 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.399443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"] Apr 22 19:43:10.477460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.477421 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location\") pod \"isvc-primary-cc83d8-predictor-858d4476f-qxlcv\" (UID: \"0726536f-15ae-435f-b928-5d765df077ce\") " pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:10.578619 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.578509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location\") pod \"isvc-primary-cc83d8-predictor-858d4476f-qxlcv\" (UID: \"0726536f-15ae-435f-b928-5d765df077ce\") " pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:10.578939 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.578917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location\") pod \"isvc-primary-cc83d8-predictor-858d4476f-qxlcv\" (UID: \"0726536f-15ae-435f-b928-5d765df077ce\") " pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:10.706129 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.706089 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:10.837981 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:10.837904 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"] Apr 22 19:43:10.840840 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:43:10.840808 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0726536f_15ae_435f_b928_5d765df077ce.slice/crio-717d18ee85e9cad0521d97444a436df7b7fcabc90a5b5743f08e1b2a6242718f WatchSource:0}: Error finding container 717d18ee85e9cad0521d97444a436df7b7fcabc90a5b5743f08e1b2a6242718f: Status 404 returned error can't find the container with id 717d18ee85e9cad0521d97444a436df7b7fcabc90a5b5743f08e1b2a6242718f Apr 22 19:43:11.641988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:11.641949 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerStarted","Data":"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"} Apr 22 19:43:11.641988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:11.641993 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerStarted","Data":"717d18ee85e9cad0521d97444a436df7b7fcabc90a5b5743f08e1b2a6242718f"} Apr 22 19:43:13.801546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:13.801481 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 22 19:43:15.655428 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:15.655391 2579 generic.go:358] "Generic (PLEG): container finished" podID="0726536f-15ae-435f-b928-5d765df077ce" containerID="bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b" exitCode=0 Apr 22 19:43:15.655827 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:15.655467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerDied","Data":"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"} Apr 22 19:43:16.659815 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:16.659773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerStarted","Data":"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"} Apr 22 19:43:16.660189 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:16.660069 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" Apr 22 19:43:16.661428 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:16.661402 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: 
connection refused" Apr 22 19:43:16.675185 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:16.675131 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podStartSLOduration=6.675117643 podStartE2EDuration="6.675117643s" podCreationTimestamp="2026-04-22 19:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:16.673692075 +0000 UTC m=+1199.478533414" watchObservedRunningTime="2026-04-22 19:43:16.675117643 +0000 UTC m=+1199.479958946" Apr 22 19:43:17.663852 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:17.663818 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 22 19:43:17.762260 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:17.762228 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:43:17.762434 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:17.762397 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:43:19.652032 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.652007 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" Apr 22 19:43:19.673713 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.673682 2579 generic.go:358] "Generic (PLEG): container finished" podID="456e4986-41dd-4d49-9586-74a584b69381" containerID="ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961" exitCode=0 Apr 22 19:43:19.673869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.673769 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"
Apr 22 19:43:19.673869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.673773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerDied","Data":"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"}
Apr 22 19:43:19.673869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.673820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5" event={"ID":"456e4986-41dd-4d49-9586-74a584b69381","Type":"ContainerDied","Data":"6be47154043ea4ab65a78efd6b53e626e2312125128bf86994cb0c3bc941e8d9"}
Apr 22 19:43:19.673869 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.673843 2579 scope.go:117] "RemoveContainer" containerID="ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"
Apr 22 19:43:19.681557 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.681543 2579 scope.go:117] "RemoveContainer" containerID="e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"
Apr 22 19:43:19.689311 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.689290 2579 scope.go:117] "RemoveContainer" containerID="ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"
Apr 22 19:43:19.689597 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:43:19.689573 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961\": container with ID starting with ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961 not found: ID does not exist" containerID="ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"
Apr 22 19:43:19.689661 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.689608 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961"} err="failed to get container status \"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961\": rpc error: code = NotFound desc = could not find container \"ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961\": container with ID starting with ccb857037f167f1ab84782da47a0e51e9b113ed5c2f92e85310cf2b9bb5b1961 not found: ID does not exist"
Apr 22 19:43:19.689661 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.689628 2579 scope.go:117] "RemoveContainer" containerID="e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"
Apr 22 19:43:19.689909 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:43:19.689891 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b\": container with ID starting with e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b not found: ID does not exist" containerID="e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"
Apr 22 19:43:19.689975 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.689916 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b"} err="failed to get container status \"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b\": rpc error: code = NotFound desc = could not find container \"e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b\": container with ID starting with e880ac576f9012c438f8f0053dbb7ecad89dd10f3ec7cc4dc6dad8547042f31b not found: ID does not exist"
Apr 22 19:43:19.753680 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.753643 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location\") pod \"456e4986-41dd-4d49-9586-74a584b69381\" (UID: \"456e4986-41dd-4d49-9586-74a584b69381\") "
Apr 22 19:43:19.753985 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.753961 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "456e4986-41dd-4d49-9586-74a584b69381" (UID: "456e4986-41dd-4d49-9586-74a584b69381"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:43:19.854496 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.854456 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/456e4986-41dd-4d49-9586-74a584b69381-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:43:19.989312 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.989240 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"]
Apr 22 19:43:19.994687 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:19.994663 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-0b5d7-predictor-67945bd99d-flhb5"]
Apr 22 19:43:21.806318 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:21.806284 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456e4986-41dd-4d49-9586-74a584b69381" path="/var/lib/kubelet/pods/456e4986-41dd-4d49-9586-74a584b69381/volumes"
Apr 22 19:43:27.664510 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:27.664460 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:43:37.664511 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:37.664462 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:43:47.664610 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:47.664561 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:43:57.664392 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:43:57.664351 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:44:07.663980 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:07.663932 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:44:17.664265 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:17.664219 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:44:27.665166 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:27.665132 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"
Apr 22 19:44:30.516973 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.516933 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:30.517382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.517274 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="storage-initializer"
Apr 22 19:44:30.517382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.517285 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="storage-initializer"
Apr 22 19:44:30.517382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.517297 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container"
Apr 22 19:44:30.517382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.517303 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container"
Apr 22 19:44:30.517382 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.517375 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="456e4986-41dd-4d49-9586-74a584b69381" containerName="kserve-container"
Apr 22 19:44:30.521481 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.521458 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.524460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.524435 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-cc83d8-dockercfg-w5n5x\""
Apr 22 19:44:30.524460 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.524451 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-cc83d8\""
Apr 22 19:44:30.525542 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.525525 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 22 19:44:30.530384 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.530361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:30.678049 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.678011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.678233 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.678063 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.779253 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.779175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.779253 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.779219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.779578 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.779562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.779889 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.779862 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert\") pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") " pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.833340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.833297 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:30.960784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.960713 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:30.965612 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:44:30.965569 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065d7847_cf9d_443f_b908_51d3f0f6786a.slice/crio-a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b WatchSource:0}: Error finding container a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b: Status 404 returned error can't find the container with id a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b
Apr 22 19:44:30.968013 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:30.967990 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:44:31.900626 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:31.900588 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerStarted","Data":"5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b"}
Apr 22 19:44:31.900626 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:31.900628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerStarted","Data":"a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b"}
Apr 22 19:44:34.913888 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:34.913856 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/0.log"
Apr 22 19:44:34.914320 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:34.913901 2579 generic.go:358] "Generic (PLEG): container finished" podID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerID="5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b" exitCode=1
Apr 22 19:44:34.914320 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:34.913984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerDied","Data":"5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b"}
Apr 22 19:44:35.918861 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:35.918831 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/0.log"
Apr 22 19:44:35.919267 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:35.918920 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerStarted","Data":"01befe6fca932beaaf7b3a0e5557f3afef8d0ce193709f1195c5c8e14aed41b5"}
Apr 22 19:44:39.931989 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.931960 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/1.log"
Apr 22 19:44:39.932377 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.932297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/0.log"
Apr 22 19:44:39.932377 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.932329 2579 generic.go:358] "Generic (PLEG): container finished" podID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerID="01befe6fca932beaaf7b3a0e5557f3afef8d0ce193709f1195c5c8e14aed41b5" exitCode=1
Apr 22 19:44:39.932459 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.932403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerDied","Data":"01befe6fca932beaaf7b3a0e5557f3afef8d0ce193709f1195c5c8e14aed41b5"}
Apr 22 19:44:39.932459 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.932451 2579 scope.go:117] "RemoveContainer" containerID="5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b"
Apr 22 19:44:39.932835 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:39.932809 2579 scope.go:117] "RemoveContainer" containerID="5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b"
Apr 22 19:44:39.942903 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:44:39.942872 2579 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_kserve-ci-e2e-test_065d7847-cf9d-443f-b908-51d3f0f6786a_0 in pod sandbox a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b from index: no such id: '5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b'" containerID="5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b"
Apr 22 19:44:39.942989 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:44:39.942930 2579 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_kserve-ci-e2e-test_065d7847-cf9d-443f-b908-51d3f0f6786a_0 in pod sandbox a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b from index: no such id: '5634607b0df6f01f6c25f77ee2138e1a490e7b5cc129569caedddbcf83e9056b'; Skipping pod \"isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_kserve-ci-e2e-test(065d7847-cf9d-443f-b908-51d3f0f6786a)\"" logger="UnhandledError"
Apr 22 19:44:39.944272 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:44:39.944244 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_kserve-ci-e2e-test(065d7847-cf9d-443f-b908-51d3f0f6786a)\"" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a"
Apr 22 19:44:40.937285 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:40.937257 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/1.log"
Apr 22 19:44:46.549883 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.549849 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"]
Apr 22 19:44:46.550369 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.550142 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" containerID="cri-o://18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df" gracePeriod=30
Apr 22 19:44:46.622078 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.622043 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:46.701557 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.701520 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:44:46.707713 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.707688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.710465 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.710440 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-21c95a-dockercfg-v4z2s\""
Apr 22 19:44:46.710607 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.710466 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-21c95a\""
Apr 22 19:44:46.714930 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.714901 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:44:46.756220 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.756199 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/1.log"
Apr 22 19:44:46.756360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.756261 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:46.816367 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.816256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.816555 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.816432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.917462 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917421 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert\") pod \"065d7847-cf9d-443f-b908-51d3f0f6786a\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") "
Apr 22 19:44:46.917610 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917534 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location\") pod \"065d7847-cf9d-443f-b908-51d3f0f6786a\" (UID: \"065d7847-cf9d-443f-b908-51d3f0f6786a\") "
Apr 22 19:44:46.917784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.917887 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917860 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "065d7847-cf9d-443f-b908-51d3f0f6786a" (UID: "065d7847-cf9d-443f-b908-51d3f0f6786a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:44:46.917947 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.917947 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917907 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "065d7847-cf9d-443f-b908-51d3f0f6786a" (UID: "065d7847-cf9d-443f-b908-51d3f0f6786a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:44:46.918046 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.917947 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065d7847-cf9d-443f-b908-51d3f0f6786a-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:44:46.918260 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.918238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.918475 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.918453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert\") pod \"isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") " pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:46.956350 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.956323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg_065d7847-cf9d-443f-b908-51d3f0f6786a/storage-initializer/1.log"
Apr 22 19:44:46.956533 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.956414 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg" event={"ID":"065d7847-cf9d-443f-b908-51d3f0f6786a","Type":"ContainerDied","Data":"a6758e3d28f45df64b5d08c6507638fbcbf72882ade95fffbdf36d81a375e12b"}
Apr 22 19:44:46.956533 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.956460 2579 scope.go:117] "RemoveContainer" containerID="01befe6fca932beaaf7b3a0e5557f3afef8d0ce193709f1195c5c8e14aed41b5"
Apr 22 19:44:46.956533 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.956483 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"
Apr 22 19:44:46.994048 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.994012 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:46.997828 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:46.997798 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cc83d8-predictor-75df94df6d-2dcrg"]
Apr 22 19:44:47.019143 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.019109 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/065d7847-cf9d-443f-b908-51d3f0f6786a-cabundle-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:44:47.019914 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.019895 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:47.147660 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.147635 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:44:47.149616 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:44:47.149573 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf78fa9e_34be_4a42_9242_5feb37d1e364.slice/crio-75904bcb250acd0d2ae48a68e58382682b77f06f93570a909f3179f835047a1b WatchSource:0}: Error finding container 75904bcb250acd0d2ae48a68e58382682b77f06f93570a909f3179f835047a1b: Status 404 returned error can't find the container with id 75904bcb250acd0d2ae48a68e58382682b77f06f93570a909f3179f835047a1b
Apr 22 19:44:47.664658 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.664611 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 22 19:44:47.805704 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.805673 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" path="/var/lib/kubelet/pods/065d7847-cf9d-443f-b908-51d3f0f6786a/volumes"
Apr 22 19:44:47.962937 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.962841 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerStarted","Data":"1fa024bc191f7d206e0d91a6a2d0c535a5f0d7d6ed22726ca6c8b5c12bbef443"}
Apr 22 19:44:47.962937 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:47.962887 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerStarted","Data":"75904bcb250acd0d2ae48a68e58382682b77f06f93570a909f3179f835047a1b"}
Apr 22 19:44:51.194157 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.194133 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"
Apr 22 19:44:51.251360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.251324 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location\") pod \"0726536f-15ae-435f-b928-5d765df077ce\" (UID: \"0726536f-15ae-435f-b928-5d765df077ce\") "
Apr 22 19:44:51.251691 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.251662 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0726536f-15ae-435f-b928-5d765df077ce" (UID: "0726536f-15ae-435f-b928-5d765df077ce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:44:51.352118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.352084 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0726536f-15ae-435f-b928-5d765df077ce-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:44:51.978362 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.978323 2579 generic.go:358] "Generic (PLEG): container finished" podID="0726536f-15ae-435f-b928-5d765df077ce" containerID="18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df" exitCode=0
Apr 22 19:44:51.978532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.978396 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerDied","Data":"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"}
Apr 22 19:44:51.978532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.978416 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"
Apr 22 19:44:51.978532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.978430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv" event={"ID":"0726536f-15ae-435f-b928-5d765df077ce","Type":"ContainerDied","Data":"717d18ee85e9cad0521d97444a436df7b7fcabc90a5b5743f08e1b2a6242718f"}
Apr 22 19:44:51.978532 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.978448 2579 scope.go:117] "RemoveContainer" containerID="18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"
Apr 22 19:44:51.987337 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.987315 2579 scope.go:117] "RemoveContainer" containerID="bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"
Apr 22 19:44:51.995146 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.995124 2579 scope.go:117] "RemoveContainer" containerID="18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"
Apr 22 19:44:51.995354 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.995332 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"]
Apr 22 19:44:51.995440 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:44:51.995421 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df\": container with ID starting with 18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df not found: ID does not exist" containerID="18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"
Apr 22 19:44:51.995480 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.995451 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df"} err="failed to get container status \"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df\": rpc error: code = NotFound desc = could not find container \"18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df\": container with ID starting with 18519ba4d65b023639f04bf8b7cb0a316d020c8ee43a90087b50351db190b8df not found: ID does not exist"
Apr 22 19:44:51.995480 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.995469 2579 scope.go:117] "RemoveContainer" containerID="bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"
Apr 22 19:44:51.995712 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:44:51.995695 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b\": container with ID starting with bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b not found: ID does not exist" containerID="bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"
Apr 22 19:44:51.995796 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.995722 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b"} err="failed to get container status \"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b\": rpc error: code = NotFound desc = could not find container \"bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b\": container with ID starting with bd6e136bab7fc3ae6a63dd735198aa773bc92879976b1e685277dbb7f1e5df9b not found: ID does not exist"
Apr 22 19:44:51.998738 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:51.998708 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cc83d8-predictor-858d4476f-qxlcv"]
Apr 22 19:44:52.982864 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:52.982833 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/0.log"
Apr 22 19:44:52.983332 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:52.982877 2579 generic.go:358] "Generic (PLEG): container finished" podID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerID="1fa024bc191f7d206e0d91a6a2d0c535a5f0d7d6ed22726ca6c8b5c12bbef443" exitCode=1
Apr 22 19:44:52.983332 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:52.982970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerDied","Data":"1fa024bc191f7d206e0d91a6a2d0c535a5f0d7d6ed22726ca6c8b5c12bbef443"}
Apr 22 19:44:53.806308 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:53.806274 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0726536f-15ae-435f-b928-5d765df077ce" path="/var/lib/kubelet/pods/0726536f-15ae-435f-b928-5d765df077ce/volumes"
Apr 22 19:44:53.988573 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:53.988543 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/0.log"
Apr 22 19:44:53.989004 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:53.988635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerStarted","Data":"284a7a51d34a2d99af5993637157f37ef7792fc86d1e56ac5da37992b84f3f2e"}
Apr 22 19:44:56.710008 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.709972 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:44:56.710426 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.710278 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer" containerID="cri-o://284a7a51d34a2d99af5993637157f37ef7792fc86d1e56ac5da37992b84f3f2e" gracePeriod=30
Apr 22 19:44:56.906333 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.906294 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:44:56.906909 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.906883 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="storage-initializer"
Apr 22 19:44:56.907075 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907061 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="storage-initializer"
Apr 22 19:44:56.907175 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907153 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.907175 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907171 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.907360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907205 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container"
Apr 22 19:44:56.907360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907216 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container"
Apr 22 19:44:56.907360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907317 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0726536f-15ae-435f-b928-5d765df077ce" containerName="kserve-container"
Apr 22 19:44:56.907360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907328 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.907360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907338 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.907590 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907428 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.907590 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.907436 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="065d7847-cf9d-443f-b908-51d3f0f6786a" containerName="storage-initializer"
Apr 22 19:44:56.910958 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.910938 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:44:56.913576 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.913548 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qbzcz\""
Apr 22 19:44:56.919040 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.919011 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:44:56.996167 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:56.996082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location\") pod \"raw-sklearn-942f9-predictor-8675645d6-74d87\" (UID: \"00b8dce5-b230-4852-ad0b-e23ea55db2a3\") " pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:44:57.096870 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:57.096826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location\") pod \"raw-sklearn-942f9-predictor-8675645d6-74d87\" (UID: \"00b8dce5-b230-4852-ad0b-e23ea55db2a3\") " pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:44:57.097213 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:57.097193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location\") pod \"raw-sklearn-942f9-predictor-8675645d6-74d87\" (UID: \"00b8dce5-b230-4852-ad0b-e23ea55db2a3\") " pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:44:57.222566 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:57.222534 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:44:57.348200 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:57.348136 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:44:57.350759 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:44:57.350711 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b8dce5_b230_4852_ad0b_e23ea55db2a3.slice/crio-890817b5015b9c90f1f36745ae343561054eef90a3bfb724e005866ac93c77b1 WatchSource:0}: Error finding container 890817b5015b9c90f1f36745ae343561054eef90a3bfb724e005866ac93c77b1: Status 404 returned error can't find the container with id 890817b5015b9c90f1f36745ae343561054eef90a3bfb724e005866ac93c77b1
Apr 22 19:44:58.003805 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:58.003768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerStarted","Data":"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"}
Apr 22 19:44:58.003805 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:58.003806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerStarted","Data":"890817b5015b9c90f1f36745ae343561054eef90a3bfb724e005866ac93c77b1"}
Apr 22 19:44:59.009283 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.009249 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/1.log"
Apr 22 19:44:59.009639 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.009587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/0.log"
Apr 22 19:44:59.009639 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.009623 2579 generic.go:358] "Generic (PLEG): container finished" podID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerID="284a7a51d34a2d99af5993637157f37ef7792fc86d1e56ac5da37992b84f3f2e" exitCode=1
Apr 22 19:44:59.009747 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.009707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerDied","Data":"284a7a51d34a2d99af5993637157f37ef7792fc86d1e56ac5da37992b84f3f2e"}
Apr 22 19:44:59.009807 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.009775 2579 scope.go:117] "RemoveContainer" containerID="1fa024bc191f7d206e0d91a6a2d0c535a5f0d7d6ed22726ca6c8b5c12bbef443"
Apr 22 19:44:59.063996 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.063971 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/1.log"
Apr 22 19:44:59.064146 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.064071 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:44:59.114210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.114121 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert\") pod \"af78fa9e-34be-4a42-9242-5feb37d1e364\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") "
Apr 22 19:44:59.114210 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.114166 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location\") pod \"af78fa9e-34be-4a42-9242-5feb37d1e364\" (UID: \"af78fa9e-34be-4a42-9242-5feb37d1e364\") "
Apr 22 19:44:59.114473 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.114449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "af78fa9e-34be-4a42-9242-5feb37d1e364" (UID: "af78fa9e-34be-4a42-9242-5feb37d1e364"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:44:59.114527 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.114466 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "af78fa9e-34be-4a42-9242-5feb37d1e364" (UID: "af78fa9e-34be-4a42-9242-5feb37d1e364"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:44:59.215452 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.215414 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/af78fa9e-34be-4a42-9242-5feb37d1e364-cabundle-cert\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:44:59.215452 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:44:59.215451 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af78fa9e-34be-4a42-9242-5feb37d1e364-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:45:00.014406 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.014377 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj_af78fa9e-34be-4a42-9242-5feb37d1e364/storage-initializer/1.log"
Apr 22 19:45:00.014850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.014494 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"
Apr 22 19:45:00.014850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.014514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj" event={"ID":"af78fa9e-34be-4a42-9242-5feb37d1e364","Type":"ContainerDied","Data":"75904bcb250acd0d2ae48a68e58382682b77f06f93570a909f3179f835047a1b"}
Apr 22 19:45:00.014850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.014555 2579 scope.go:117] "RemoveContainer" containerID="284a7a51d34a2d99af5993637157f37ef7792fc86d1e56ac5da37992b84f3f2e"
Apr 22 19:45:00.044241 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.044208 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:45:00.048288 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:00.048256 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-21c95a-predictor-56c84f677d-7ktsj"]
Apr 22 19:45:01.018631 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:01.018601 2579 generic.go:358] "Generic (PLEG): container finished" podID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerID="4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8" exitCode=0
Apr 22 19:45:01.019043 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:01.018682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerDied","Data":"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"}
Apr 22 19:45:01.806474 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:01.806427 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" path="/var/lib/kubelet/pods/af78fa9e-34be-4a42-9242-5feb37d1e364/volumes"
Apr 22 19:45:02.025133 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:02.025092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerStarted","Data":"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"}
Apr 22 19:45:02.025505 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:02.025413 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:45:02.026642 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:02.026612 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:02.043061 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:02.043006 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podStartSLOduration=6.042987581 podStartE2EDuration="6.042987581s" podCreationTimestamp="2026-04-22 19:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:45:02.04082611 +0000 UTC m=+1304.845667415" watchObservedRunningTime="2026-04-22 19:45:02.042987581 +0000 UTC m=+1304.847828882"
Apr 22 19:45:03.029164 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:03.029116 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:13.029899 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:13.029850 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:23.029540 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:23.029491 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:33.029852 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:33.029757 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:43.029343 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:43.029296 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:45:53.029681 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:45:53.029625 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:46:03.029397 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:03.029342 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 22 19:46:13.031029 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:13.030997 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:46:16.949242 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:16.949206 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:46:16.949698 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:16.949580 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" containerID="cri-o://a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e" gracePeriod=30
Apr 22 19:46:17.044224 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044191 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"]
Apr 22 19:46:17.044542 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044526 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.044587 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044544 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.044587 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044557 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.044587 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044563 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.044680 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044624 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.044680 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.044632 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af78fa9e-34be-4a42-9242-5feb37d1e364" containerName="storage-initializer"
Apr 22 19:46:17.047870 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.047847 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"
Apr 22 19:46:17.062351 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.062322 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"]
Apr 22 19:46:17.100025 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.099980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location\") pod \"raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk\" (UID: \"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"
Apr 22 19:46:17.200803 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.200687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location\") pod \"raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk\" (UID: \"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"
Apr 22 19:46:17.201091 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.201070 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location\") pod \"raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk\" (UID: \"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"
Apr 22 19:46:17.359164 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.359124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"
Apr 22 19:46:17.483418 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:17.483392 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"]
Apr 22 19:46:17.486003 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:46:17.485968 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f9b4ea_4a98_4a95_8f49_4d294bf355c5.slice/crio-b9fd5505a9d1b220371062016287b2890482e202e37d409358fe434b7252fb39 WatchSource:0}: Error finding container b9fd5505a9d1b220371062016287b2890482e202e37d409358fe434b7252fb39: Status 404 returned error can't find the container with id b9fd5505a9d1b220371062016287b2890482e202e37d409358fe434b7252fb39
Apr 22 19:46:18.272118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:18.272073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerStarted","Data":"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967"}
Apr 22 19:46:18.272118 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:18.272116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerStarted","Data":"b9fd5505a9d1b220371062016287b2890482e202e37d409358fe434b7252fb39"}
Apr 22 19:46:21.520671 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:21.520642 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:46:21.638923 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:21.638895 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location\") pod \"00b8dce5-b230-4852-ad0b-e23ea55db2a3\" (UID: \"00b8dce5-b230-4852-ad0b-e23ea55db2a3\") "
Apr 22 19:46:21.639245 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:21.639217 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "00b8dce5-b230-4852-ad0b-e23ea55db2a3" (UID: "00b8dce5-b230-4852-ad0b-e23ea55db2a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:46:21.739429 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:21.739384 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b8dce5-b230-4852-ad0b-e23ea55db2a3-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\""
Apr 22 19:46:22.287639 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.287600 2579 generic.go:358] "Generic (PLEG): container finished" podID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerID="0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967" exitCode=0
Apr 22 19:46:22.287852 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.287675 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerDied","Data":"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967"}
Apr 22 19:46:22.289221 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.289200 2579 generic.go:358] "Generic (PLEG): container finished" podID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerID="a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e" exitCode=0
Apr 22 19:46:22.289340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.289245 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerDied","Data":"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"}
Apr 22 19:46:22.289340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.289264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87" event={"ID":"00b8dce5-b230-4852-ad0b-e23ea55db2a3","Type":"ContainerDied","Data":"890817b5015b9c90f1f36745ae343561054eef90a3bfb724e005866ac93c77b1"}
Apr 22 19:46:22.289340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.289263 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"
Apr 22 19:46:22.289340 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.289280 2579 scope.go:117] "RemoveContainer" containerID="a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"
Apr 22 19:46:22.297380 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.297362 2579 scope.go:117] "RemoveContainer" containerID="4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"
Apr 22 19:46:22.305680 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.305655 2579 scope.go:117] "RemoveContainer" containerID="a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"
Apr 22 19:46:22.306014 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:46:22.305990 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e\": container with ID starting with a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e not found: ID does not exist" containerID="a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"
Apr 22 19:46:22.306084 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.306023 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e"} err="failed to get container status \"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e\": rpc error: code = NotFound desc = could not find container \"a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e\": container with ID starting with a81d68e3e14396e842bdc0b9e94a28ba2bac1b45e31607a128e436a06d4bbd4e not found: ID does not exist"
Apr 22 19:46:22.306084 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.306044 2579 scope.go:117] "RemoveContainer" containerID="4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"
Apr 22 19:46:22.306351 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:46:22.306324 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8\": container with ID starting with 4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8 not found: ID does not exist" containerID="4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"
Apr 22 19:46:22.306472 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.306356 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8"} err="failed to get container status \"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8\": rpc error: code = NotFound desc = could not find container \"4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8\": container with ID starting with 4b3ddba252a882bc05c85b32078ed118f2e6d71a4a4a497b196070fbb23fb6c8 not found: ID does not exist"
Apr 22 19:46:22.316684 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.316654 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:46:22.320599 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:22.320574 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-942f9-predictor-8675645d6-74d87"]
Apr 22 19:46:23.294895 ip-10-0-133-159 kubenswrapper[2579]:
I0422 19:46:23.294862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerStarted","Data":"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311"} Apr 22 19:46:23.295271 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:23.295158 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" Apr 22 19:46:23.296566 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:23.296540 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:46:23.310749 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:23.310648 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podStartSLOduration=6.310633249 podStartE2EDuration="6.310633249s" podCreationTimestamp="2026-04-22 19:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:23.31005401 +0000 UTC m=+1386.114895349" watchObservedRunningTime="2026-04-22 19:46:23.310633249 +0000 UTC m=+1386.115474552" Apr 22 19:46:23.805321 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:23.805288 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" path="/var/lib/kubelet/pods/00b8dce5-b230-4852-ad0b-e23ea55db2a3/volumes" Apr 22 19:46:24.298249 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:24.298213 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:46:34.298373 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:34.298322 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:46:44.298521 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:44.298469 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:46:54.298464 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:46:54.298419 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:04.298922 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:04.298824 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:14.298205 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:14.298156 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:24.298661 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:24.298607 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:27.803722 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:27.803677 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:37.805362 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:37.805330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" Apr 22 19:47:47.409877 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:47.409831 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"] Apr 22 19:47:47.410352 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:47.410162 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" containerID="cri-o://b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311" gracePeriod=30 Apr 22 19:47:47.803901 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:47.803798 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 22 19:47:51.955974 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:51.955949 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" Apr 22 19:47:52.009385 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.009307 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location\") pod \"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5\" (UID: \"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5\") " Apr 22 19:47:52.009660 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.009636 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" (UID: "c2f9b4ea-4a98-4a95-8f49-4d294bf355c5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:52.110656 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.110619 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5-kserve-provision-location\") on node \"ip-10-0-133-159.ec2.internal\" DevicePath \"\"" Apr 22 19:47:52.565939 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.565903 2579 generic.go:358] "Generic (PLEG): container finished" podID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerID="b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311" exitCode=0 Apr 22 19:47:52.566227 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.565981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerDied","Data":"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311"} Apr 22 19:47:52.566227 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.566008 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" Apr 22 19:47:52.566227 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.566019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk" event={"ID":"c2f9b4ea-4a98-4a95-8f49-4d294bf355c5","Type":"ContainerDied","Data":"b9fd5505a9d1b220371062016287b2890482e202e37d409358fe434b7252fb39"} Apr 22 19:47:52.566227 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.566034 2579 scope.go:117] "RemoveContainer" containerID="b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311" Apr 22 19:47:52.574942 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.574924 2579 scope.go:117] "RemoveContainer" containerID="0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967" Apr 22 19:47:52.582216 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.582198 2579 scope.go:117] "RemoveContainer" containerID="b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311" Apr 22 19:47:52.582472 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:47:52.582450 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311\": container with ID starting with b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311 not found: ID does not exist" containerID="b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311" Apr 22 19:47:52.582550 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.582478 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311"} err="failed to get container status \"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311\": rpc error: code = NotFound desc = could not find container \"b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311\": container with ID starting with b747a84114fb24f719edd5678e286e80cd2c8d98a724984f6de9686fb4f42311 not found: ID does not exist" Apr 22 19:47:52.582550 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.582493 2579 scope.go:117] "RemoveContainer" containerID="0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967" Apr 22 19:47:52.582798 ip-10-0-133-159 kubenswrapper[2579]: E0422 19:47:52.582717 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967\": container with ID starting with 0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967 not found: ID does not exist" containerID="0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967" Apr 22 19:47:52.582798 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.582771 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967"} err="failed to get container status \"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967\": rpc error: code = NotFound desc = could not find container \"0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967\": container with ID starting with 0deef222da42e7b58a2dc38d09316f2a100b42f5da6247c47de876a5be935967 not found: ID does not exist" Apr 22 19:47:52.589033 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.589010 2579 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"] Apr 22 19:47:52.592962 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:52.592941 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1b8f0-predictor-77d78885c4-bk6xk"] Apr 22 19:47:53.805699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:47:53.805660 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" path="/var/lib/kubelet/pods/c2f9b4ea-4a98-4a95-8f49-4d294bf355c5/volumes" Apr 22 19:48:12.945363 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945326 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-skx6f/must-gather-lcp9h"] Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945664 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="storage-initializer" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945676 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="storage-initializer" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945694 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="storage-initializer" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945699 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="storage-initializer" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945707 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945713 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945720 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945740 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945809 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f9b4ea-4a98-4a95-8f49-4d294bf355c5" containerName="kserve-container" Apr 22 19:48:12.945967 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.945819 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="00b8dce5-b230-4852-ad0b-e23ea55db2a3" containerName="kserve-container" Apr 22 19:48:12.948651 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.948633 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:12.951471 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.951451 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-skx6f\"/\"default-dockercfg-6vwdj\"" Apr 22 19:48:12.951832 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.951815 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"kube-root-ca.crt\"" Apr 22 19:48:12.951879 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.951864 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-skx6f\"/\"openshift-service-ca.crt\"" Apr 22 19:48:12.962081 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:12.962052 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/must-gather-lcp9h"] Apr 22 19:48:13.096599 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.096560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-must-gather-output\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.096599 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.096602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn55x\" (UniqueName: \"kubernetes.io/projected/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-kube-api-access-rn55x\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.197829 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.197696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-must-gather-output\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.197829 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.197778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn55x\" (UniqueName: \"kubernetes.io/projected/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-kube-api-access-rn55x\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.198059 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.198039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-must-gather-output\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.207270 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.207228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn55x\" (UniqueName: \"kubernetes.io/projected/fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb-kube-api-access-rn55x\") pod \"must-gather-lcp9h\" (UID: \"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb\") " pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.276329 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.276288 2579 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-skx6f/must-gather-lcp9h" Apr 22 19:48:13.399325 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.399293 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/must-gather-lcp9h"] Apr 22 19:48:13.402143 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:48:13.402111 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdde6bb9_9451_4c02_bac6_5e1ca61e0ebb.slice/crio-55756161ff1463bb219160edc507b7427e98dbd878e7cf84602fec26d6c9aa2b WatchSource:0}: Error finding container 55756161ff1463bb219160edc507b7427e98dbd878e7cf84602fec26d6c9aa2b: Status 404 returned error can't find the container with id 55756161ff1463bb219160edc507b7427e98dbd878e7cf84602fec26d6c9aa2b Apr 22 19:48:13.630645 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:13.630610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/must-gather-lcp9h" event={"ID":"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb","Type":"ContainerStarted","Data":"55756161ff1463bb219160edc507b7427e98dbd878e7cf84602fec26d6c9aa2b"} Apr 22 19:48:14.636498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:14.636450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/must-gather-lcp9h" event={"ID":"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb","Type":"ContainerStarted","Data":"957f08c0a80436ea7013f8132c492c80f21c8fc5972f792f07c43122067b33fb"} Apr 22 19:48:14.636498 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:14.636504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/must-gather-lcp9h" event={"ID":"fdde6bb9-9451-4c02-bac6-5e1ca61e0ebb","Type":"ContainerStarted","Data":"15ff5a60bd87b6be8db4e76bbc50eefb59af24950621da86968b6a930e7542ed"} Apr 22 19:48:14.654147 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:14.654080 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-skx6f/must-gather-lcp9h" podStartSLOduration=1.841365228 podStartE2EDuration="2.654061353s" podCreationTimestamp="2026-04-22 19:48:12 +0000 UTC" firstStartedPulling="2026-04-22 19:48:13.404322746 +0000 UTC m=+1496.209164028" lastFinishedPulling="2026-04-22 19:48:14.217018865 +0000 UTC m=+1497.021860153" observedRunningTime="2026-04-22 19:48:14.653346693 +0000 UTC m=+1497.458188005" watchObservedRunningTime="2026-04-22 19:48:14.654061353 +0000 UTC m=+1497.458902660" Apr 22 19:48:15.814555 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:15.814521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kjl96_ffc97da9-d997-4214-bf06-cbdb4a551c74/global-pull-secret-syncer/0.log" Apr 22 19:48:15.991181 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:15.991152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rpmcm_76a63f32-8306-496d-ab47-f0ec1293937f/konnectivity-agent/0.log" Apr 22 19:48:16.051086 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:16.051050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-159.ec2.internal_b82df689b1e19e43cfff5d00b46485d2/haproxy/0.log" Apr 22 19:48:17.846751 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:17.846568 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 
19:48:17.849703 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:17.848610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:48:19.597450 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.597359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/alertmanager/0.log" Apr 22 19:48:19.625202 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.625107 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/config-reloader/0.log" Apr 22 19:48:19.657682 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.657648 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/kube-rbac-proxy-web/0.log" Apr 22 19:48:19.691140 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.691110 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/kube-rbac-proxy/0.log" Apr 22 19:48:19.726197 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.726170 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/kube-rbac-proxy-metric/0.log" Apr 22 19:48:19.755804 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.755773 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/prom-label-proxy/0.log" Apr 22 19:48:19.791629 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.791598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ad0a23d8-c6df-4985-ae22-96cdc1d30ef5/init-config-reloader/0.log" Apr 22 19:48:19.838610 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.838549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-z82sr_a0018d13-5e39-40ec-a8e1-bc62c3aeee0a/cluster-monitoring-operator/0.log" Apr 22 19:48:19.865596 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.865505 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srrrb_59ed8a36-517d-40fc-b340-cfe4a80582da/kube-state-metrics/0.log" Apr 22 19:48:19.896230 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.896175 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srrrb_59ed8a36-517d-40fc-b340-cfe4a80582da/kube-rbac-proxy-main/0.log" Apr 22 19:48:19.927793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:19.927756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-srrrb_59ed8a36-517d-40fc-b340-cfe4a80582da/kube-rbac-proxy-self/0.log" Apr 22 19:48:20.210981 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.210890 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pq9pg_9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad/node-exporter/0.log" Apr 22 19:48:20.237696 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.237667 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-pq9pg_9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad/kube-rbac-proxy/0.log" Apr 22 19:48:20.264755 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.264698 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pq9pg_9aa2bf52-4c93-4fd4-9f9b-78fa40aa6fad/init-textfile/0.log" Apr 22 19:48:20.295488 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.295451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mqnsf_a85b69a4-76cb-4a0b-aea8-8638b5db8f71/kube-rbac-proxy-main/0.log" Apr 22 19:48:20.322000 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.321968 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mqnsf_a85b69a4-76cb-4a0b-aea8-8638b5db8f71/kube-rbac-proxy-self/0.log" Apr 22 19:48:20.351173 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.351144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mqnsf_a85b69a4-76cb-4a0b-aea8-8638b5db8f71/openshift-state-metrics/0.log" Apr 22 19:48:20.734302 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.733622 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66495fb49d-kb29w_434a8f7d-a89d-4e80-8940-ec8a6154255c/telemeter-client/0.log" Apr 22 19:48:20.773776 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.773710 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66495fb49d-kb29w_434a8f7d-a89d-4e80-8940-ec8a6154255c/reload/0.log" Apr 22 19:48:20.818207 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:20.818121 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66495fb49d-kb29w_434a8f7d-a89d-4e80-8940-ec8a6154255c/kube-rbac-proxy/0.log" Apr 22 19:48:22.902743 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.902699 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd"] Apr 22 19:48:22.906856 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.906829 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:22.921496 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.921466 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd"] Apr 22 19:48:22.972334 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.972295 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d979f574c-9pjz9_f520a25a-85c4-4b6b-934e-0fa9933832ac/console/0.log" Apr 22 19:48:22.995633 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.995594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-proc\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:22.995850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.995717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-sys\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:22.995850 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.995803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-podres\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:22.995988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.995882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kzf\" (UniqueName: \"kubernetes.io/projected/0efa4fff-c4f3-4238-9847-26dda04d439c-kube-api-access-k7kzf\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:22.995988 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:22.995912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-lib-modules\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096510 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096472 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-podres\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kzf\" (UniqueName: \"kubernetes.io/projected/0efa4fff-c4f3-4238-9847-26dda04d439c-kube-api-access-k7kzf\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " 
pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-lib-modules\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-proc\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-sys\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096699 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-podres\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096930 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-proc\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096930 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-lib-modules\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.096930 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.096766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efa4fff-c4f3-4238-9847-26dda04d439c-sys\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.106103 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.106068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kzf\" (UniqueName: \"kubernetes.io/projected/0efa4fff-c4f3-4238-9847-26dda04d439c-kube-api-access-k7kzf\") pod \"perf-node-gather-daemonset-rfzhd\" (UID: \"0efa4fff-c4f3-4238-9847-26dda04d439c\") " pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.218878 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.218777 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.387625 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.387591 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd"] Apr 22 19:48:23.387857 ip-10-0-133-159 kubenswrapper[2579]: W0422 19:48:23.387826 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0efa4fff_c4f3_4238_9847_26dda04d439c.slice/crio-0211e9c0ea5cd1f468983d9a742b6204deeb20e878e09ee9c37438d6c83775d9 WatchSource:0}: Error finding container 0211e9c0ea5cd1f468983d9a742b6204deeb20e878e09ee9c37438d6c83775d9: Status 404 returned error can't find the container with id 0211e9c0ea5cd1f468983d9a742b6204deeb20e878e09ee9c37438d6c83775d9 Apr 22 19:48:23.485194 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.485121 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-82bbd_0146a304-7dc4-4714-9289-ca9e3f151e55/volume-data-source-validator/0.log" Apr 22 19:48:23.673031 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.672996 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" event={"ID":"0efa4fff-c4f3-4238-9847-26dda04d439c","Type":"ContainerStarted","Data":"eb5d75d5f706d7638670526758d27cc0003afeb74b070407511eed854d401653"} Apr 22 19:48:23.673031 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.673033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" event={"ID":"0efa4fff-c4f3-4238-9847-26dda04d439c","Type":"ContainerStarted","Data":"0211e9c0ea5cd1f468983d9a742b6204deeb20e878e09ee9c37438d6c83775d9"} Apr 22 19:48:23.673275 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.673123 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:23.696143 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:23.696061 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" podStartSLOduration=1.696039947 podStartE2EDuration="1.696039947s" podCreationTimestamp="2026-04-22 19:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:23.69039773 +0000 UTC m=+1506.495239036" watchObservedRunningTime="2026-04-22 19:48:23.696039947 +0000 UTC m=+1506.500881253" Apr 22 19:48:24.321586 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:24.321547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrq92_4cbce8ae-11e1-44fb-a76c-a617c14a01cb/dns/0.log" Apr 22 19:48:24.347069 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:24.347042 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrq92_4cbce8ae-11e1-44fb-a76c-a617c14a01cb/kube-rbac-proxy/0.log" Apr 22 19:48:24.377242 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:24.377211 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bh88_73366f6c-dc1e-4c5b-a1f3-e3d7839a351e/dns-node-resolver/0.log" Apr 22 19:48:24.891360 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:24.891322 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-c9898b8d8-4dftw_463b4c94-dbcc-4e22-8c34-c3a19a07bc68/registry/0.log" Apr 22 19:48:24.982793 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:24.982762 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-drfzw_54216124-3633-4740-9592-06c935cb0781/node-ca/0.log" Apr 22 19:48:25.758429 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:25.758386 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55dd5f6bfb-9bmd5_da14fcd3-4263-4cf2-abf8-bdaccee3e441/router/0.log" Apr 22 19:48:26.154784 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.154758 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b42mv_db70a090-7023-443d-b909-09cc5a489c13/serve-healthcheck-canary/0.log" Apr 22 19:48:26.635150 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.635106 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-g5jd8_c6aa8798-c003-4836-b89b-2ec659893918/insights-operator/0.log" Apr 22 19:48:26.635414 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.635379 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-g5jd8_c6aa8798-c003-4836-b89b-2ec659893918/insights-operator/1.log" Apr 22 19:48:26.861866 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.861832 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x8ngq_ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef/kube-rbac-proxy/0.log" Apr 22 19:48:26.892504 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.892416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x8ngq_ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef/exporter/0.log" Apr 22 19:48:26.918969 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:26.918941 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x8ngq_ebddd58c-975e-4d0e-ab4c-b9a7564ac2ef/extractor/0.log" Apr 22 19:48:28.931862 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:28.931823 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-545d8995fb-7258g_f5adafa5-fc01-4bc5-8639-c47908aec837/manager/0.log" Apr 22 19:48:29.007905 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:29.007874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-bq8rn_160df6da-5e63-48cf-bf25-63821f86465a/server/0.log" Apr 22 19:48:29.214812 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:29.214710 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-hv5pj_7e60006c-826d-4997-9ffc-e6d055a80ad6/seaweedfs/0.log" Apr 22 19:48:29.686790 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:29.686751 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-skx6f/perf-node-gather-daemonset-rfzhd" Apr 22 19:48:33.679566 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:33.679537 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wzvx6_a21b65b1-0841-47bb-a561-5ca5bb8578a0/migrator/0.log" Apr 22 19:48:33.711969 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:33.711939 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wzvx6_a21b65b1-0841-47bb-a561-5ca5bb8578a0/graceful-termination/0.log" Apr 22 19:48:35.283437 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.283405 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/kube-multus-additional-cni-plugins/0.log" Apr 22 19:48:35.310546 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.310511 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/egress-router-binary-copy/0.log" Apr 22 19:48:35.336853 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.336820 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/cni-plugins/0.log" Apr 22 19:48:35.365299 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.365273 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/bond-cni-plugin/0.log" Apr 22 19:48:35.389913 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.389886 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/routeoverride-cni/0.log" Apr 22 19:48:35.420723 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.420695 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/whereabouts-cni-bincopy/0.log" Apr 22 19:48:35.447217 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.447188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzp7f_1cfa1b6f-8796-4d69-9c36-cd2bfdc2280d/whereabouts-cni/0.log" Apr 22 19:48:35.768565 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.768515 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t8zwd_04dcf06d-ab97-49fd-b8b0-d5036c249ae1/kube-multus/0.log" Apr 22 19:48:35.922431 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.922403 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qwbg8_9c0f6922-8799-4caa-adfb-fa958fee9291/network-metrics-daemon/0.log" Apr 22 19:48:35.953304 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:35.953273 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qwbg8_9c0f6922-8799-4caa-adfb-fa958fee9291/kube-rbac-proxy/0.log" Apr 22 19:48:36.812794 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.812767 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-controller/0.log" Apr 22 19:48:36.842950 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.842919 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/0.log" Apr 22 19:48:36.849894 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.849853 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovn-acl-logging/1.log" Apr 22 19:48:36.876691 ip-10-0-133-159 kubenswrapper[2579]: 
I0422 19:48:36.876663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/kube-rbac-proxy-node/0.log" Apr 22 19:48:36.905617 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.905586 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:48:36.936817 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.936792 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/northd/0.log" Apr 22 19:48:36.969775 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:36.969747 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/nbdb/0.log" Apr 22 19:48:37.002955 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:37.002923 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/sbdb/0.log" Apr 22 19:48:37.117947 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:37.117887 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2crp2_9a484ef5-ac14-4ff2-ab99-82238200be07/ovnkube-controller/0.log" Apr 22 19:48:38.956178 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:38.956133 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xkr7v_5037e972-6e10-4b21-bde5-a072bf744013/network-check-target-container/0.log" Apr 22 19:48:39.960282 ip-10-0-133-159 kubenswrapper[2579]: I0422 19:48:39.960181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tjgwl_e451824a-2133-4364-b91f-8b08929198a3/iptables-alerter/0.log"