Apr 16 22:13:36.944846 ip-10-0-129-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:37.412414 ip-10-0-129-102 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:37.412414 ip-10-0-129-102 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:37.412414 ip-10-0-129-102 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:37.412414 ip-10-0-129-102 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:37.412414 ip-10-0-129-102 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:37.414051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.413935 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
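The five deprecation warnings above all point to the same fix: move the flag values into the file named by --config (per the flag dump below, /etc/kubernetes/kubelet.conf; on an OpenShift node that file is rendered by the machine-config operator rather than edited by hand). A minimal sketch of the config-file equivalents, assuming the kubelet.config.k8s.io/v1beta1 KubeletConfiguration schema and reusing the values visible in this log; the eviction threshold is a placeholder, since --minimum-container-ttl-duration has no direct config-file field:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings;
# this threshold value is illustrative only
evictionHard:
  memory.available: "100Mi"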
Apr 16 22:13:37.417003 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.416989 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:37.417003 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417002 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417006 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417010 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417013 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417017 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417020 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417023 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417026 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417033 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417036 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417039 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417041 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417044 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417047 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417049 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417053 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417057 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417060 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417063 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:37.417065 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417067 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417070 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417073 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417076 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417078 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417081 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417084 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417086 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417089 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417091 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417094 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417097 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417101 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417104 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417107 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417110 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417112 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417115 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417118 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:37.417515 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417120 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417123 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417125 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417128 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417130 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417133 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417135 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417137 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417140 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417142 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417145 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417147 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417150 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417152 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417155 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417158 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417161 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417163 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417166 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417169 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:37.418015 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417171 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417174 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417176 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417179 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417181 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417184 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417186 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417194 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417197 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417200 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417202 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417205 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417207 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417210 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417213 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417215 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417219 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417221 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417224 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417226 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:37.418491 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417230 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417233 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417235 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417238 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417240 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417243 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.417245 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418864 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418871 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418874 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418877 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418880 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418882 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418886 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418888 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418891 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418893 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418896 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418898 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:37.418995 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418906 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418910 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418914 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418916 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418919 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418922 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418924 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418927 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418930 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418932 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418935 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418937 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418940 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418943 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418945 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418948 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418951 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418955 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418957 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418960 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:37.419444 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418963 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418965 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418968 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418970 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418973 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418975 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418978 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418981 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418983 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418986 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418990 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418993 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.418997 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419003 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419006 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419009 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419012 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419014 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419016 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:37.419942 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419019 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419022 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419024 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419027 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419029 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419032 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419035 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419039 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419042 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419044 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419047 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419049 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419052 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419054 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419057 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419059 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419062 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419065 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419067 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419069 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:37.420398 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419072 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419074 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419077 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419079 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419082 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419084 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419087 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419090 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419093 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419095 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419098 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419101 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419103 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419106 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419109 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419173 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419180 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419194 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419200 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419207 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419211 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:37.420911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419218 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419222 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419226 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419228 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419232 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419235 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419238 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419241 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419244 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419247 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419250 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419252 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419255 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419260 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419263 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419266 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419270 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419273 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419277 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419281 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419284 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419287 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419290 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419293 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:37.421420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419296 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419299 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419302 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419306 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419309 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419312 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419314 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419318 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419320 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419324 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419327 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419330 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419333 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419336 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419340 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419342 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419345 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419348 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419351 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419353 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419356 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419359 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419362 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419364 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419367 2571 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:37.422029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419371 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419374 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419377 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419381 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419384 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419387 2571 flags.go:64] FLAG: --help="false"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419390 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419393 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419396 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419399 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419403 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419406 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419409 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419412 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419415 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419417 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419420 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419423 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419426 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419429 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419431 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419435 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419437 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419440 2571 flags.go:64] FLAG: --lock-file=""
Apr 16 22:13:37.422648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419443 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419446 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419449 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419454 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419457 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419460 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419463 2571 flags.go:64] FLAG: --logging-format="text"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419466 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419470 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419473 2571 flags.go:64] FLAG: --manifest-url=""
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419475 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419480 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419486 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419491 2571 flags.go:64] FLAG: --max-pods="110"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419494 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419497 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419499 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419502 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419505 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419508 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419511 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419518 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419521 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419524 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 22:13:37.423216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419527 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419530 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419536 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419539 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419542 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419545 2571 flags.go:64] FLAG: --port="10250"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419560 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419564 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cd599461b4c45aa3"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419567 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419571 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419574 2571 flags.go:64] FLAG: --register-node="true"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419576 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419579 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419583 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419586 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419589 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419592 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419595 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419598 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419602 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419605 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419608 2571 flags.go:64] FLAG: --runonce="false"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419611 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419614 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419617 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 22:13:37.423846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419620 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419623 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419626 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419629 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419632 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419635 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419638 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419640 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419643 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419646 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419649 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419652 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419657 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419660 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419663 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419667 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419670 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419673 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419676 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419678 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419681 2571 flags.go:64] FLAG: --v="2"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419686 2571 flags.go:64] FLAG: --version="false"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419691 2571 flags.go:64] FLAG: --vmodule=""
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419696 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.419699 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
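The flags.go:64 dump above records every flag value after command-line parsing; most entries show built-in defaults, and for anything not set explicitly on the command line the file at --config=/etc/kubernetes/kubelet.conf is what actually takes effect, since explicit kubelet flags override the config file. A sketch, for illustration only, of how a few of the dumped values would read in that file, using the same v1beta1 schema as above with values copied from the dump:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
maxPods: 110                # FLAG: --max-pods="110"
containerLogMaxFiles: 5     # FLAG: --container-log-max-files="5"
containerLogMaxSize: 10Mi   # FLAG: --container-log-max-size="10Mi"
serializeImagePulls: true   # FLAG: --serialize-image-pulls="true"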
Apr 16 22:13:37.424448 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419792 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419795 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419798 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419802 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419808 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419811 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419814 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419817 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419820 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419822 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419825 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419828 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419831 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419833 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419836 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419838 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419841 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419844 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419846 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419849 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:37.425108 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419852 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419854 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419857 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419860 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419862 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419865 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419867 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419870 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419872 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419876 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419879 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419881 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419884 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419887 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419889 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419892 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419895 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419899 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419901 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419904 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:37.425622 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419906 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419909 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419912 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419914 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419917 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419919 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419922 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419924 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419927 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419930 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419932 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419934 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419937 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419939 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419942 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419945 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419947 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419950 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419953 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419955 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:37.426114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419958 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419966 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419969 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419972 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419975 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419977 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419980 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419984 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419986 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419989 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419992 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419994 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.419997 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420001 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420004 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420007 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420010 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420013 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:37.426608 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420016 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420018 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420021 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420023 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420026 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420029 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.420031 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:37.427074 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.420705 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.427147 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.427163 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427219 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427224 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427228 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427234 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427237 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427240 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427243 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427247 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427250 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427253 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427256 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427259 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:37.427255 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427262 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427265 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427268 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427271 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427274 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427276 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427279 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427281 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427284 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427287 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427289 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427292 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427294 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427297 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427299 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427302 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427304 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427307 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427310 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427313 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:37.427639 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427315 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427318 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427320 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427323 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427325 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427328 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427330 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427333 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427335 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427338 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427340 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427343 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427345 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427348 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427351 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427354 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427357 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427360 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427362 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:37.428157 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427365 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427367 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427370 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427373 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427375 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427378 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427380 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427382 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427385 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427387 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427390 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427392 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427395 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427398 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427400 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427403 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427406 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427410 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427414 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427416 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:37.428666 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427419 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427421 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427424 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427426 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427429 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427432 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427435 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427437 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427440 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427443 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427445 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427448 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427451 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427453 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427456 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.427461 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:37.429152 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427583 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427589 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427592 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427595 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427598 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427601 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427603 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427606 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427609 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427611 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427615 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427617 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427620 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427622 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427625 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427628 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427631 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427633 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427635 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427638 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:37.429545 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427640 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427644 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427648 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427651 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427654 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427656 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427659 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427661 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427664 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427666 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427668 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427671 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427674 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427676 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427678 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427681 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427683 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427686 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427690 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:37.430037 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427693 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427695 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427698 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427702 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427705 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427707 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427710 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427713 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427715 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427718 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427720 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427722 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427725 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427727 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427730 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427734 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427736 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427739 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427742 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427744 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:37.430604 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427746 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427749 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427751 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427754 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427756 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427759 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427761 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427764 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427766 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427769 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427771 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427774 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427776 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427779 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427782 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427786 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427788 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427791 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427793 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:37.431360 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427796 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427798 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427801 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427803 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427806 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427808 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427811 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:37.427813 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.427818 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:37.431894 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.429274 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:37.433893 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.433872 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:37.434989 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.434975 2571 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:37.435085 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.435068 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:37.435119 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.435110 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:37.460392 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.460373 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:37.462950 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.462930 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:37.481485 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.481465 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:37.488788 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.488773 2571 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:37.490447 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.490429 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:37.491247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.491228 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:37.494960 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.494932 2571 fs.go:135] Filesystem UUIDs: map[630b8320-9c00-4df2-a3ff-8403a4187cd9:/dev/nvme0n1p3 72da38a9-662e-45cf-977a-6127ceef225e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 22:13:37.495017 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.494963 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:37.501482 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.501370 2571 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:37.499459464 +0000 UTC m=+0.430784317 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100115 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec269e648077ebb8578665f0aeb27397 SystemUUID:ec269e64-8077-ebb8-5786-65f0aeb27397 BootID:a4b11270-e617-4130-b81d-1449ddc893f2 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e5:fa:7f:4e:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e5:fa:7f:4e:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:51:03:fe:ca:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:37.501482 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.501471 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:37.501633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.501544 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:37.502575 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.502543 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:37.502706 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.502577 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:37.502757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.502715 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:37.502757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.502724 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:37.502757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.502736 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:37.503570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.503559 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:37.504402 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.504392 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:37.504534 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.504524 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:13:37.507092 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.507077 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:13:37.507132 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.507097 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:13:37.507132 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.507107 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:13:37.507132 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.507116 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:13:37.507132 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.507127 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:13:37.508152 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.508137 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:37.508225 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.508157 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:37.511240 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.511224 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:13:37.512426 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.512411 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:13:37.514732 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514720 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514736 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514742 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514748 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514754 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514760 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514765 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514771 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514778 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:13:37.514784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514784 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:13:37.515014 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514794 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:13:37.515014 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.514803 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:13:37.515824 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.515814 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:13:37.515857 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.515824 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:13:37.519345 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.519331 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:13:37.519401 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.519375 2571 server.go:1295] "Started kubelet"
Apr 16 22:13:37.519496 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.519472 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:13:37.519546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.519471 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:13:37.519546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.519529 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:13:37.520037 ip-10-0-129-102 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:13:37.520649 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.520579 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:37.522068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.522052 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:37.522779 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.522760 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:37.523107 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.523085 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:37.523153 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.523087 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:37.526975 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.526959 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:37.527580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.527564 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:37.528187 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528170 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:37.528297 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528191 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:37.528297 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.526854 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-102.ec2.internal.18a6f6080f31fc05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-102.ec2.internal,UID:ip-10-0-129-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-102.ec2.internal,},FirstTimestamp:2026-04-16 22:13:37.519344645 +0000 UTC m=+0.450669498,LastTimestamp:2026-04-16 22:13:37.519344645 +0000 UTC m=+0.450669498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-102.ec2.internal,}" Apr 16 22:13:37.528297 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528238 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:37.528456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528307 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:37.528456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528316 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 16 
22:13:37.528456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528446 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:13:37.528633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528458 2571 factory.go:55] Registering systemd factory
Apr 16 22:13:37.528633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.528468 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:13:37.528633 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.528507 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.530030 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.530012 2571 factory.go:153] Registering CRI-O factory
Apr 16 22:13:37.530030 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.530031 2571 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:13:37.530185 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.530054 2571 factory.go:103] Registering Raw factory
Apr 16 22:13:37.530185 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.530070 2571 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:13:37.530876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.530857 2571 manager.go:319] Starting recovery of all containers
Apr 16 22:13:37.531156 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.531134 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:13:37.532711 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.532684 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 22:13:37.532926 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.532902 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 22:13:37.535976 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.535953 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pf542"
Apr 16 22:13:37.542577 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.542560 2571 manager.go:324] Recovery completed
Apr 16 22:13:37.544104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.544087 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pf542"
Apr 16 22:13:37.546559 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.546538 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.549057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549039 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.549146 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549071 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.549146 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549085 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.549511 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549495 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:13:37.549511 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549509 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:13:37.549632 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.549522 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:37.551082 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.551023 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-102.ec2.internal.18a6f60810f755ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-102.ec2.internal,UID:ip-10-0-129-102.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-102.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-102.ec2.internal,},FirstTimestamp:2026-04-16 22:13:37.549055438 +0000 UTC m=+0.480380291,LastTimestamp:2026-04-16 22:13:37.549055438 +0000 UTC m=+0.480380291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-102.ec2.internal,}"
Apr 16 22:13:37.551849 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.551837 2571 policy_none.go:49] "None policy: Start"
Apr 16 22:13:37.551886 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.551854 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:13:37.551886 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.551864 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:13:37.588200 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588074 2571 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.588221 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588234 2571 server.go:85] "Starting device plugin registration server"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588417 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588427 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588485 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588577 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.588583 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.589116 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:37.592412 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.589152 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.672722 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.672664 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:37.673862 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.673838 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:13:37.673945 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.673865 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:37.673945 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.673881 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:37.673945 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.673889 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:37.673945 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.673926 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:37.676066 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.676051 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:37.688688 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.688674 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.689419 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.689405 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.689466 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.689432 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.689466 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.689443 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.689466 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.689464 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.697666 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.697653 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.697708 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.697673 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-102.ec2.internal\": node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.713706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.713687 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.774757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.774738 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"]
Apr 16 22:13:37.774822 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.774812 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.776113 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.776101 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.776165 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.776125 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.776165 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.776135 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.777171 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777159 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.777293 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.777326 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777308 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.777831 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777814 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.777921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777843 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.777921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777824 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.777921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777856 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.777921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777871 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.777921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.777883 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.778839 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.778825 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.778910 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.778849 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:37.779480 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.779466 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:37.779582 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.779492 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:37.779582 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.779507 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:37.803173 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.803156 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-102.ec2.internal\" not found" node="ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.807329 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.807315 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-102.ec2.internal\" not found" node="ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.814447 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.814434 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.829451 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.829433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.829513 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.829457 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.829513 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.829474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.914736 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:37.914706 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:37.930021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.929967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.930021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.929991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.930021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.930008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.930153 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.930054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.930153 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.930100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:37.930153 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:37.930133 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:38.015350 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.015328 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.104772 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.104734 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:38.110220 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.110201 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:38.115923 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.115904 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.216574 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.216479 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.316975 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.316945 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.417642 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.417615 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.435054 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.435024 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:38.435169 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.435154 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:38.518085 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:38.518059 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 16 22:13:38.527419 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.527399 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:38.529924 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.529907 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:38.538749 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.538731 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:38.546824 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.546798 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:37 +0000 UTC" deadline="2028-01-22 08:39:11.690589466 +0000 UTC"
Apr 16 22:13:38.546908 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.546824 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15490h25m33.143768299s"
Apr 16 22:13:38.559528 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.559506 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sxxbq"
Apr 16 22:13:38.570467 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.570450 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sxxbq"
Apr 16 22:13:38.573449 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:38.573232 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432a54920ff69b032f406403f8e82323.slice/crio-4a842ff78c9edd9d89532594a58f474f3a9777ef72b57b06f9aecf35583abc28 WatchSource:0}: Error finding container 4a842ff78c9edd9d89532594a58f474f3a9777ef72b57b06f9aecf35583abc28: Status 404 returned error can't find the container with id 4a842ff78c9edd9d89532594a58f474f3a9777ef72b57b06f9aecf35583abc28
Apr 16 22:13:38.578114 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.578085 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:38.580580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.580564 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:38.628036 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.628019 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:38.643449 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.643435 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:38.644334 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.644323 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 16 22:13:38.650279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.650260 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:38.677088 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.677047 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" event={"ID":"432a54920ff69b032f406403f8e82323","Type":"ContainerStarted","Data":"4a842ff78c9edd9d89532594a58f474f3a9777ef72b57b06f9aecf35583abc28"}
Apr 16 22:13:38.677978 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.677959 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerStarted","Data":"d3e3570459537c6e2cce0a946820fe08f7952eb9f02a015d3d0b8a10f3ea880b"}
Apr 16 22:13:38.838523 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:38.838466 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:39.286890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.286813 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:39.507626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.507600 2571 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:39.513850 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.513826 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
pods=["openshift-ovn-kubernetes/ovnkube-node-8trxs","kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hfs2p","openshift-multus/multus-additional-cni-plugins-fd5n6","openshift-multus/multus-r6vhd","openshift-multus/network-metrics-daemon-2f4gk","openshift-network-diagnostics/network-check-target-kh55g","openshift-network-operator/iptables-alerter-6rfdq","kube-system/konnectivity-agent-5bljr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4","openshift-dns/node-resolver-bz2qc","openshift-image-registry/node-ca-4mcb4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"] Apr 16 22:13:39.519687 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.519295 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:39.519687 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.519378 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f" Apr 16 22:13:39.520260 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.520240 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.520373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.520354 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.521352 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.521337 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.521429 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.521411 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:13:39.522511 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.522490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.522835 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.522794 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:39.522933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.522865 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:39.522933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.522917 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2shqw\"" Apr 16 22:13:39.523341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:39.523341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523302 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.523341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.523541 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523357 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.523541 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523322 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzrbl\"" Apr 16 22:13:39.523656 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523643 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.523747 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.523729 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.524846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.524784 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:39.524846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.524815 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:39.525844 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.525600 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.525965 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.525895 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:39.525965 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.525908 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.526279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526122 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dk4kq\"" Apr 16 22:13:39.526279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526127 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.526279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526249 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.526544 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526334 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:39.526544 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526533 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.526666 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-thbxz\"" Apr 16 22:13:39.526666 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526623 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:39.526836 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.526816 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.527474 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.527445 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:39.528344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.528197 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p97zh\"" Apr 16 22:13:39.528344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.528202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.528344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.528261 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.528533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.528411 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.530637 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.529917 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.530637 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.530414 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5wclm\"" Apr 16 22:13:39.531154 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.531035 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:39.532375 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.532171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:39.533682 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.533658 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:39.533863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.533837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-r2mbz\"" Apr 16 22:13:39.533949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.533876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bxft\"" Apr 16 22:13:39.534040 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.533991 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.534387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.534364 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.534628 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.534612 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.537216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537160 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:39.537216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqddp\"" Apr 16 22:13:39.537348 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537230 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:39.537348 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537253 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:39.537440 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7547aebf-2698-4a15-952f-2dc060a10282-iptables-alerter-script\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.537440 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.537440 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cni-binary-copy\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.537599 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovn-node-metrics-cert\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.537599 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537465 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysconfig\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.537599 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537478 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-tuned\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.537599 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:13:39.537518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn4j\" (UniqueName: \"kubernetes.io/projected/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-kube-api-access-lqn4j\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.537599 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-bin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-slash\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-config\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-systemd\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-sys\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-binary-copy\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.537813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-system-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-os-release\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537865 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-netns\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-socket-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-system-cni-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.537993 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-cnibin\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b6l7\" (UniqueName: 
\"kubernetes.io/projected/401b401e-f58b-4d1a-ac91-0376c9ee48ff-kube-api-access-9b6l7\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538100 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-device-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-conf-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7eb428a4-211d-4444-991e-d8b3dac28ddf-hosts-file\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-etc-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-modprobe-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-host\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538316 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-k8s-cni-cncf-io\") pod \"multus-r6vhd\" (UID: 
\"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-os-release\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9gg\" (UniqueName: \"kubernetes.io/projected/efd08e10-5da2-4f18-afe6-c78ed9bde562-kube-api-access-mg9gg\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-kubernetes\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cnibin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-kubelet\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-hostroot\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538543 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-daemon-config\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-node-log\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-multus\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-bin\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538750 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-script-lib\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-agent-certs\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-tmp\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-kubelet\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.538949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538946 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-systemd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-env-overrides\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.538995 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-sys-fs\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-run\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-lib-modules\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7547aebf-2698-4a15-952f-2dc060a10282-host-slash\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-multus-certs\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539148 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-etc-kubernetes\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-z5bwp\" (UniqueName: \"kubernetes.io/projected/7de19038-d2d6-4b61-acee-01b1d7fed4e2-kube-api-access-z5bwp\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5kw\" (UniqueName: \"kubernetes.io/projected/7547aebf-2698-4a15-952f-2dc060a10282-kube-api-access-8q5kw\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7eb428a4-211d-4444-991e-d8b3dac28ddf-tmp-dir\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-systemd-units\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq42x\" (UniqueName: \"kubernetes.io/projected/7eb428a4-211d-4444-991e-d8b3dac28ddf-kube-api-access-hq42x\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.539506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxth5\" (UniqueName: \"kubernetes.io/projected/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-kube-api-access-bxth5\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539463 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-netd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-registration-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8cpx\" (UniqueName: \"kubernetes.io/projected/1b0bb086-62f3-4a36-aba2-2986f55f2550-kube-api-access-p8cpx\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-var-lib-kubelet\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-socket-dir-parent\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-ovn\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-konnectivity-ca\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-netns\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-var-lib-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-conf\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.540062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.539751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-log-socket\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.571070 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.571040 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:38 +0000 UTC" deadline="2028-01-25 17:12:46.598781094 +0000 UTC" Apr 16 22:13:39.571151 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.571070 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15570h59m7.027714337s" Apr 16 22:13:39.629503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.629478 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:13:39.640757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.640725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.640891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.640763 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9b6l7\" (UniqueName: \"kubernetes.io/projected/401b401e-f58b-4d1a-ac91-0376c9ee48ff-kube-api-access-9b6l7\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.640891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.640867 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-device-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.640997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.640892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-conf-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.640997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.640907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.641201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641176 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-device-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.641274 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7eb428a4-211d-4444-991e-d8b3dac28ddf-hosts-file\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.641274 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.641369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-etc-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.641369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-modprobe-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.641369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-host\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 
16 22:13:39.641369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-conf-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.641567 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7eb428a4-211d-4444-991e-d8b3dac28ddf-hosts-file\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.641567 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641406 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-etc-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.641567 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-k8s-cni-cncf-io\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.641710 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641566 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-modprobe-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.641710 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-host\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.641710 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641582 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-os-release\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.641710 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-k8s-cni-cncf-io\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.641883 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg9gg\" (UniqueName: \"kubernetes.io/projected/efd08e10-5da2-4f18-afe6-c78ed9bde562-kube-api-access-mg9gg\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 
22:13:39.641883 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641770 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-kubernetes\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.641883 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cnibin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.641883 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-os-release\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cnibin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.641923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-kubernetes\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.642061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-kubelet\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-hostroot\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-daemon-config\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642172 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-node-log\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-hostroot\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-multus\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-bin\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-kubelet\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-script-lib\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642415 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-agent-certs\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.642546 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:13:39.642533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-tmp\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-kubelet\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-systemd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-env-overrides\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-sys-fs\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-run\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-lib-modules\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642781 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7547aebf-2698-4a15-952f-2dc060a10282-host-slash\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.642845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-multus-certs\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643206 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:13:39.642859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-etc-kubernetes\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-daemon-config\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bwp\" (UniqueName: \"kubernetes.io/projected/7de19038-d2d6-4b61-acee-01b1d7fed4e2-kube-api-access-z5bwp\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-script-lib\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.642925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-lib-modules\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.643249 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-multus\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5kw\" (UniqueName: 
\"kubernetes.io/projected/7547aebf-2698-4a15-952f-2dc060a10282-kube-api-access-8q5kw\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.643344 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:40.143309505 +0000 UTC m=+3.074634359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643383 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-sys-fs\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643394 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c62daff-6789-4383-b4d0-6b51a07c06bb-serviceca\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643425 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-node-log\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7eb428a4-211d-4444-991e-d8b3dac28ddf-tmp-dir\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:39.643510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-bin\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-systemd-units\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-multus-certs\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq42x\" (UniqueName: \"kubernetes.io/projected/7eb428a4-211d-4444-991e-d8b3dac28ddf-kube-api-access-hq42x\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-run\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643699 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxth5\" (UniqueName: \"kubernetes.io/projected/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-kube-api-access-bxth5\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-netd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643869 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-registration-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.643993 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-env-overrides\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-systemd-units\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-registration-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.643993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.643956 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8cpx\" (UniqueName: \"kubernetes.io/projected/1b0bb086-62f3-4a36-aba2-2986f55f2550-kube-api-access-p8cpx\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-cni-netd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7547aebf-2698-4a15-952f-2dc060a10282-host-slash\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-var-lib-kubelet\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644225 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cktd\" (UniqueName: \"kubernetes.io/projected/5c62daff-6789-4383-b4d0-6b51a07c06bb-kube-api-access-4cktd\") 
pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-socket-dir-parent\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-ovn\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-konnectivity-ca\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-netns\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.644505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-var-lib-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-conf\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c62daff-6789-4383-b4d0-6b51a07c06bb-host\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-log-socket\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7547aebf-2698-4a15-952f-2dc060a10282-iptables-alerter-script\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644652 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cni-binary-copy\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovn-node-metrics-cert\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysconfig\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-tuned\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7eb428a4-211d-4444-991e-d8b3dac28ddf-tmp-dir\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644824 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lqn4j\" (UniqueName: \"kubernetes.io/projected/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-kube-api-access-lqn4j\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-bin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-kubelet\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.645027 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.644951 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:39.645657 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.645595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-var-lib-cni-bin\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.645657 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.645612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.645774 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.645740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-etc-kubernetes\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.645997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.645974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-host-run-netns\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.646156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-log-socket\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.646243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-conf\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.646243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-var-lib-kubelet\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.646900 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-socket-dir-parent\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.647003 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysconfig\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.647003 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-ovn\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.646997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.647508 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647250 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-var-lib-openvswitch\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647508 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7de19038-d2d6-4b61-acee-01b1d7fed4e2-cni-binary-copy\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.647508 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-run-systemd\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647712 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-konnectivity-ca\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.647712 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-slash\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647712 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647737 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-slash\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-config\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.647856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-systemd\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.647856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-sys\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647868 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-binary-copy\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.648086 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:13:39.647893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-system-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647897 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7547aebf-2698-4a15-952f-2dc060a10282-iptables-alerter-script\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-os-release\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-netns\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-socket-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-sys\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.647991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-system-cni-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648016 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-systemd\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-cnibin\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.648086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/401b401e-f58b-4d1a-ac91-0376c9ee48ff-host-run-netns\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-cnibin\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-system-cni-dir\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648148 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efd08e10-5da2-4f18-afe6-c78ed9bde562-os-release\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-multus-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b0bb086-62f3-4a36-aba2-2986f55f2550-socket-dir\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7de19038-d2d6-4b61-acee-01b1d7fed4e2-system-cni-dir\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovnkube-config\") pod 
\"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-sysctl-d\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-etc-tuned\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.649288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.648855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-tmp\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.649956 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.649828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502-agent-certs\") pod \"konnectivity-agent-5bljr\" (UID: \"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502\") " pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.649956 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.649934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efd08e10-5da2-4f18-afe6-c78ed9bde562-cni-binary-copy\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.650843 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.650821 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/401b401e-f58b-4d1a-ac91-0376c9ee48ff-ovn-node-metrics-cert\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.651840 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.651720 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:39.651840 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.651744 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:39.651840 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.651771 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:39.652070 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:39.651892 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:13:40.151861991 +0000 UTC m=+3.083186834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:39.653051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.653032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b6l7\" (UniqueName: \"kubernetes.io/projected/401b401e-f58b-4d1a-ac91-0376c9ee48ff-kube-api-access-9b6l7\") pod \"ovnkube-node-8trxs\" (UID: \"401b401e-f58b-4d1a-ac91-0376c9ee48ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.654664 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.654612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bwp\" (UniqueName: \"kubernetes.io/projected/7de19038-d2d6-4b61-acee-01b1d7fed4e2-kube-api-access-z5bwp\") pod \"multus-r6vhd\" (UID: \"7de19038-d2d6-4b61-acee-01b1d7fed4e2\") " pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.654873 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.654820 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxth5\" (UniqueName: \"kubernetes.io/projected/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-kube-api-access-bxth5\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:39.655258 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.655238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn4j\" (UniqueName: \"kubernetes.io/projected/3bd79e7a-3b44-45b8-aefb-daaeaf2abb75-kube-api-access-lqn4j\") pod \"tuned-hfs2p\" (UID: \"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75\") " pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.655333 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.655317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq42x\" (UniqueName: \"kubernetes.io/projected/7eb428a4-211d-4444-991e-d8b3dac28ddf-kube-api-access-hq42x\") pod \"node-resolver-bz2qc\" (UID: \"7eb428a4-211d-4444-991e-d8b3dac28ddf\") " pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.655953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.655926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5kw\" (UniqueName: \"kubernetes.io/projected/7547aebf-2698-4a15-952f-2dc060a10282-kube-api-access-8q5kw\") pod \"iptables-alerter-6rfdq\" (UID: \"7547aebf-2698-4a15-952f-2dc060a10282\") " pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.656441 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.656420 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9gg\" (UniqueName: \"kubernetes.io/projected/efd08e10-5da2-4f18-afe6-c78ed9bde562-kube-api-access-mg9gg\") pod \"multus-additional-cni-plugins-fd5n6\" (UID: \"efd08e10-5da2-4f18-afe6-c78ed9bde562\") " 
pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.656764 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.656741 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8cpx\" (UniqueName: \"kubernetes.io/projected/1b0bb086-62f3-4a36-aba2-2986f55f2550-kube-api-access-p8cpx\") pod \"aws-ebs-csi-driver-node-bbvm4\" (UID: \"1b0bb086-62f3-4a36-aba2-2986f55f2550\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.748473 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.748429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cktd\" (UniqueName: \"kubernetes.io/projected/5c62daff-6789-4383-b4d0-6b51a07c06bb-kube-api-access-4cktd\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.748473 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.748476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c62daff-6789-4383-b4d0-6b51a07c06bb-host\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.748701 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.748537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c62daff-6789-4383-b4d0-6b51a07c06bb-host\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.748701 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.748683 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c62daff-6789-4383-b4d0-6b51a07c06bb-serviceca\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.749043 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.749022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c62daff-6789-4383-b4d0-6b51a07c06bb-serviceca\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.756183 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.756157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cktd\" (UniqueName: \"kubernetes.io/projected/5c62daff-6789-4383-b4d0-6b51a07c06bb-kube-api-access-4cktd\") pod \"node-ca-4mcb4\" (UID: \"5c62daff-6789-4383-b4d0-6b51a07c06bb\") " pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:39.833588 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.833511 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" Apr 16 22:13:39.840118 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.840099 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" Apr 16 22:13:39.847725 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.847704 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:13:39.853332 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.853313 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6rfdq" Apr 16 22:13:39.861897 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.861876 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" Apr 16 22:13:39.867431 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.867412 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-r6vhd" Apr 16 22:13:39.873990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.873974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5bljr" Apr 16 22:13:39.880480 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.880464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bz2qc" Apr 16 22:13:39.885021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:39.885002 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4mcb4" Apr 16 22:13:40.062612 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.062584 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7de19038_d2d6_4b61_acee_01b1d7fed4e2.slice/crio-05b74bb5c7cd5a9dbac1435318715e37c876702115ed066cb6b6e2449b434bc8 WatchSource:0}: Error finding container 05b74bb5c7cd5a9dbac1435318715e37c876702115ed066cb6b6e2449b434bc8: Status 404 returned error can't find the container with id 05b74bb5c7cd5a9dbac1435318715e37c876702115ed066cb6b6e2449b434bc8 Apr 16 22:13:40.063337 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.063314 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb3ceb69_c9ef_44b5_bfc3_0c42f2d8d502.slice/crio-974716bb03f24be4aca52e206e21df5e2f92e70c6237f793789fc9ff2c6a84ed WatchSource:0}: Error finding container 974716bb03f24be4aca52e206e21df5e2f92e70c6237f793789fc9ff2c6a84ed: Status 404 returned error can't find the container with id 974716bb03f24be4aca52e206e21df5e2f92e70c6237f793789fc9ff2c6a84ed Apr 16 22:13:40.064098 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.064076 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c62daff_6789_4383_b4d0_6b51a07c06bb.slice/crio-c8778ba11ebfcf8f9906ee98e25df4cec144da919ac733ffba43de7aeb1195ab WatchSource:0}: Error finding container c8778ba11ebfcf8f9906ee98e25df4cec144da919ac733ffba43de7aeb1195ab: Status 404 returned error can't find the container with id c8778ba11ebfcf8f9906ee98e25df4cec144da919ac733ffba43de7aeb1195ab Apr 16 22:13:40.066954 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.066933 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b0bb086_62f3_4a36_aba2_2986f55f2550.slice/crio-69832aae041240043df9ac8122f128c5cf9b739d44a8366c79689d2e90215d00 WatchSource:0}: Error finding container 69832aae041240043df9ac8122f128c5cf9b739d44a8366c79689d2e90215d00: Status 404 returned error can't find the container with id 69832aae041240043df9ac8122f128c5cf9b739d44a8366c79689d2e90215d00 Apr 16 
22:13:40.069026 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.068998 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd79e7a_3b44_45b8_aefb_daaeaf2abb75.slice/crio-810249ab3bcd1b1cefcf91d2b48798d53f4b27f877eaaf93b87e56df01023a24 WatchSource:0}: Error finding container 810249ab3bcd1b1cefcf91d2b48798d53f4b27f877eaaf93b87e56df01023a24: Status 404 returned error can't find the container with id 810249ab3bcd1b1cefcf91d2b48798d53f4b27f877eaaf93b87e56df01023a24 Apr 16 22:13:40.069940 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.069923 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7547aebf_2698_4a15_952f_2dc060a10282.slice/crio-cb0066de28f8694e51da5c1d300512539d1e636d9d2cdb5ae5e7cf1d8b5804f8 WatchSource:0}: Error finding container cb0066de28f8694e51da5c1d300512539d1e636d9d2cdb5ae5e7cf1d8b5804f8: Status 404 returned error can't find the container with id cb0066de28f8694e51da5c1d300512539d1e636d9d2cdb5ae5e7cf1d8b5804f8 Apr 16 22:13:40.071986 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.071960 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd08e10_5da2_4f18_afe6_c78ed9bde562.slice/crio-b7189bf5d8955e8fcbf8bcac22cad59171d87e7425c3160acdcc3b3ea3082f5b WatchSource:0}: Error finding container b7189bf5d8955e8fcbf8bcac22cad59171d87e7425c3160acdcc3b3ea3082f5b: Status 404 returned error can't find the container with id b7189bf5d8955e8fcbf8bcac22cad59171d87e7425c3160acdcc3b3ea3082f5b Apr 16 22:13:40.074708 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.074601 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401b401e_f58b_4d1a_ac91_0376c9ee48ff.slice/crio-291d8b75136408ae4910e62127678062a946f2e86c03a13314fbac5aa32f5c7c WatchSource:0}: Error finding container 291d8b75136408ae4910e62127678062a946f2e86c03a13314fbac5aa32f5c7c: Status 404 returned error can't find the container with id 291d8b75136408ae4910e62127678062a946f2e86c03a13314fbac5aa32f5c7c Apr 16 22:13:40.075783 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:13:40.075762 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb428a4_211d_4444_991e_d8b3dac28ddf.slice/crio-29664119813f2544707c8cd6117acb30537513b5a64d08cec3ecd2c1a80f6862 WatchSource:0}: Error finding container 29664119813f2544707c8cd6117acb30537513b5a64d08cec3ecd2c1a80f6862: Status 404 returned error can't find the container with id 29664119813f2544707c8cd6117acb30537513b5a64d08cec3ecd2c1a80f6862 Apr 16 22:13:40.151301 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.151161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:40.151301 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.151290 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:40.151421 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.151340 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.151326852 +0000 UTC m=+4.082651697 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:40.252029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.252002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:40.252202 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.252136 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:40.252202 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.252154 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:40.252202 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.252166 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:40.252350 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.252224 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.252206116 +0000 UTC m=+4.183530969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:40.571573 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.571404 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:38 +0000 UTC" deadline="2028-01-18 16:33:04.044329318 +0000 UTC" Apr 16 22:13:40.571573 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.571441 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15402h19m23.472892691s" Apr 16 22:13:40.675390 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.674900 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:40.675390 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:40.675025 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:13:40.689806 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.689118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" event={"ID":"432a54920ff69b032f406403f8e82323","Type":"ContainerStarted","Data":"b2a305710e16519058a400631a8394f7b8a79357f27290eed669ccc434ede398"} Apr 16 22:13:40.691058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.691006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bz2qc" event={"ID":"7eb428a4-211d-4444-991e-d8b3dac28ddf","Type":"ContainerStarted","Data":"29664119813f2544707c8cd6117acb30537513b5a64d08cec3ecd2c1a80f6862"} Apr 16 22:13:40.692201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.692158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"291d8b75136408ae4910e62127678062a946f2e86c03a13314fbac5aa32f5c7c"} Apr 16 22:13:40.694004 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.693890 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" event={"ID":"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75","Type":"ContainerStarted","Data":"810249ab3bcd1b1cefcf91d2b48798d53f4b27f877eaaf93b87e56df01023a24"} Apr 16 22:13:40.702184 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.702160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5bljr" event={"ID":"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502","Type":"ContainerStarted","Data":"974716bb03f24be4aca52e206e21df5e2f92e70c6237f793789fc9ff2c6a84ed"} Apr 16 22:13:40.711932 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.711885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerStarted","Data":"b7189bf5d8955e8fcbf8bcac22cad59171d87e7425c3160acdcc3b3ea3082f5b"} Apr 16 22:13:40.718325 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.717300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6rfdq" event={"ID":"7547aebf-2698-4a15-952f-2dc060a10282","Type":"ContainerStarted","Data":"cb0066de28f8694e51da5c1d300512539d1e636d9d2cdb5ae5e7cf1d8b5804f8"} Apr 16 22:13:40.728162 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.728105 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" event={"ID":"1b0bb086-62f3-4a36-aba2-2986f55f2550","Type":"ContainerStarted","Data":"69832aae041240043df9ac8122f128c5cf9b739d44a8366c79689d2e90215d00"} Apr 16 22:13:40.732758 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.732727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4mcb4" 
event={"ID":"5c62daff-6789-4383-b4d0-6b51a07c06bb","Type":"ContainerStarted","Data":"c8778ba11ebfcf8f9906ee98e25df4cec144da919ac733ffba43de7aeb1195ab"} Apr 16 22:13:40.737593 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:40.737537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6vhd" event={"ID":"7de19038-d2d6-4b61-acee-01b1d7fed4e2","Type":"ContainerStarted","Data":"05b74bb5c7cd5a9dbac1435318715e37c876702115ed066cb6b6e2449b434bc8"} Apr 16 22:13:41.165181 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.165103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:41.165334 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.165274 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.165398 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.165336 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.165316701 +0000 UTC m=+6.096641561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.266301 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.266134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:41.266301 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.266281 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:41.266301 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.266301 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:41.266301 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.266313 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:41.266638 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.266368 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.266350833 +0000 UTC m=+6.197675695 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:41.677331 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.677299 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:41.677796 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:41.677430 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f" Apr 16 22:13:41.755486 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.755443 2571 generic.go:358] "Generic (PLEG): container finished" podID="d00e776913cd1177ab03d04d7041f574" containerID="33f161a50540cc657cf23402057c1181949ca9156bf84193a405d656ef5dead7" exitCode=0 Apr 16 22:13:41.756366 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.756333 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerDied","Data":"33f161a50540cc657cf23402057c1181949ca9156bf84193a405d656ef5dead7"} Apr 16 22:13:41.771322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:41.771273 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" podStartSLOduration=3.771254332 podStartE2EDuration="3.771254332s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:40.705736439 +0000 UTC m=+3.637061306" watchObservedRunningTime="2026-04-16 22:13:41.771254332 +0000 UTC m=+4.702579195" Apr 16 22:13:42.674973 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:42.674927 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:42.675144 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:42.675054 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:13:42.769350 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:42.769266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerStarted","Data":"2af80f697b38aa880dfd4259108a20f761924a7a9df553b6bbff0e59c6996056"} Apr 16 22:13:42.782813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:42.782756 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" podStartSLOduration=4.7827405800000005 podStartE2EDuration="4.78274058s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:42.78201806 +0000 UTC m=+5.713342923" watchObservedRunningTime="2026-04-16 22:13:42.78274058 +0000 UTC m=+5.714065443" Apr 16 22:13:43.179233 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:43.179146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:43.179404 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.179311 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:43.179404 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.179375 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:47.179356945 +0000 UTC m=+10.110681799 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:43.280535 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:43.280498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:43.280706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.280668 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:43.280706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.280688 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:43.280706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.280702 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:43.280875 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.280770 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:13:47.280751986 +0000 UTC m=+10.212076839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:43.675534 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:43.675447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:43.676012 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:43.675925 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f" Apr 16 22:13:44.674791 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:44.674755 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:44.675255 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:44.674900 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:13:45.675169 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:45.675126 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:45.675640 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:45.675251 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f" Apr 16 22:13:45.984575 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:45.984486 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jng42"] Apr 16 22:13:45.991054 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:45.990688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:45.991054 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:45.990772 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e" Apr 16 22:13:46.105626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.105421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.105626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.105485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-kubelet-config\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.105626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.105518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-dbus\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.205933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.205843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.205933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.205906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-kubelet-config\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.205933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.205938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-dbus\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.206208 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:46.205982 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:46.206208 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:46.206064 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.706041218 +0000 UTC m=+9.637366058 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:46.206208 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.206081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-kubelet-config\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.206208 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.206115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88aba50-e875-413f-a0d8-0887fad52a8e-dbus\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.674941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.674909 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:46.675104 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:46.675044 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:13:46.710665 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:46.710632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:13:46.711119 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:46.710796 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:46.711119 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:46.710866 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:47.710845216 +0000 UTC m=+10.642170070 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:47.215065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:47.215025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:13:47.215254 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.215203 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:47.215310 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.215272 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:55.215252424 +0000 UTC m=+18.146577271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:47.316225 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:47.315633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:13:47.316225 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.315800 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:47.316225 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.315818 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:47.316225 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.315831 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:47.316225 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.315886 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:13:55.315868648 +0000 UTC m=+18.247193492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:47.675651 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:47.675164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:47.675651 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.675278 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:47.675651 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:47.675454 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:47.675651 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.675541 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:47.718935 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:47.718894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:47.719343 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.719054 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:47.719343 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:47.719123 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.71910307 +0000 UTC m=+12.650427913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.674487 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:48.674447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:48.674739 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:48.674626 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:49.674809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:49.674774 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:49.674809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:49.674811 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:49.675260 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:49.674891 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:49.675260 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:49.675004 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:49.733387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:49.733353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:49.733574 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:49.733473 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:49.733574 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:49.733526 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:13:53.733509443 +0000 UTC m=+16.664834298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:50.674730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:50.674696 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:50.674899 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:50.674820 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:51.674135 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:51.674099 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:51.674314 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:51.674217 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:51.674314 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:51.674274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:51.674419 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:51.674371 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:52.675121 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:52.675049 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:52.675512 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:52.675180 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:53.675103 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:53.675070 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:53.675319 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:53.675186 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:53.675319 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:53.675244 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:53.675706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:53.675325 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:53.762946 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:53.762909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:53.763190 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:53.763083 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:53.763190 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:53.763162 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.763144267 +0000 UTC m=+24.694469107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:54.674623 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:54.674591 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:54.674874 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:54.674710 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:55.274950 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:55.274907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:55.275370 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.275054 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:55.275370 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.275127 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:11.275108883 +0000 UTC m=+34.206433743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:55.375407 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:55.375371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:55.375610 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.375526 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:55.375610 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.375560 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:55.375610 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.375573 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:55.375775 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.375625 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:14:11.37561265 +0000 UTC m=+34.306937494 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:55.675164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:55.675071 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:55.675354 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:55.675077 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:55.675354 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.675215 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:55.675354 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:55.675290 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:56.674891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:56.674863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:56.675240 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:56.674969 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:57.675276 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.674989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:57.675977 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:57.675330 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:57.675977 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.675051 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:57.675977 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:57.675466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:57.792953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.792923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bz2qc" event={"ID":"7eb428a4-211d-4444-991e-d8b3dac28ddf","Type":"ContainerStarted","Data":"0400e623d459919358c468331235715e5a0c4825fbfdf3356c78b83c69083cb3"}
Apr 16 22:13:57.795575 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795537 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"d1a5067d6d9985bb264d2f2c640b8fdd2e823d4ea387279c288cf55de34865e7"}
Apr 16 22:13:57.795695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795584 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"9f6fa30058370aad1b2aa329ace617453a91e94a0cb85fdf16d7c36bc4305ae3"}
Apr 16 22:13:57.795695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795599 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"2d0974135121d095887588d14a0f71d3f8291bf489fb4245ebcf723955d11cad"}
Apr 16 22:13:57.795695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"a19035c14fa56ec80dec811d6ac6c1ddd541407a44c22089d97f03560bc05fc7"}
Apr 16 22:13:57.795695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"fca95efa8a5582d1cb5682aac5bc86269e20300b67792a88b8630c42568b895a"}
Apr 16 22:13:57.795695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.795637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"55d0f40f976831f230fbef2b9e7be8608b845381b5f20893fd2f5340383093b8"}
Apr 16 22:13:57.797184 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.797159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" event={"ID":"3bd79e7a-3b44-45b8-aefb-daaeaf2abb75","Type":"ContainerStarted","Data":"47b8d532781cbbadde73560c393e3635c6cb3847aeef4a71b596241566e20efe"}
Apr 16 22:13:57.798354 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.798331 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5bljr" event={"ID":"eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502","Type":"ContainerStarted","Data":"f61a1c2401bff2c427dc5b90cb3517d1fc3268fd3a634680569bf4d149ec360b"}
Apr 16 22:13:57.799863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.799840 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="12f55c22dcb4ec20d72cd323d669d16153564fbfbeef898691c396c0d7dbeecb" exitCode=0
Apr 16 22:13:57.799944 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.799912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"12f55c22dcb4ec20d72cd323d669d16153564fbfbeef898691c396c0d7dbeecb"}
Apr 16 22:13:57.802059 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.802017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" event={"ID":"1b0bb086-62f3-4a36-aba2-2986f55f2550","Type":"ContainerStarted","Data":"378b38cb54bb4afae27fc3eefde36d853b95a85bb7fe7baf8fda873cd823e7f4"}
Apr 16 22:13:57.802059 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.802050 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 22:13:57.803224 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.803204 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4mcb4" event={"ID":"5c62daff-6789-4383-b4d0-6b51a07c06bb","Type":"ContainerStarted","Data":"a6d2df28d94a437dd730216b78993b73a75a6be44d8f49e0255faae0da8cd06a"}
Apr 16 22:13:57.805434 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.805414 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5bljr"
Apr 16 22:13:57.805966 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.805948 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5bljr"
Apr 16 22:13:57.806217 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.806173 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bz2qc" podStartSLOduration=3.176779198 podStartE2EDuration="19.806157058s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.077523381 +0000 UTC m=+3.008848221" lastFinishedPulling="2026-04-16 22:13:56.70690124 +0000 UTC m=+19.638226081" observedRunningTime="2026-04-16 22:13:57.805693313 +0000 UTC m=+20.737018187" watchObservedRunningTime="2026-04-16 22:13:57.806157058 +0000 UTC m=+20.737481921"
Apr 16 22:13:57.807846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.807826 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-r6vhd" event={"ID":"7de19038-d2d6-4b61-acee-01b1d7fed4e2","Type":"ContainerStarted","Data":"01855d3e0a4bf89f0af0ff379494477902eae64e64077a0f70f5a55631658309"}
Apr 16 22:13:57.817639 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.817607 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5bljr" podStartSLOduration=3.176563742 podStartE2EDuration="19.817597845s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.065859026 +0000 UTC m=+2.997183869" lastFinishedPulling="2026-04-16 22:13:56.706893113 +0000 UTC m=+19.638217972" observedRunningTime="2026-04-16 22:13:57.817155752 +0000 UTC m=+20.748480605" watchObservedRunningTime="2026-04-16 22:13:57.817597845 +0000 UTC m=+20.748922706"
Apr 16 22:13:57.841236 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.841202 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4mcb4" podStartSLOduration=3.201038792 podStartE2EDuration="19.841192894s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.066781464 +0000 UTC m=+2.998106318" lastFinishedPulling="2026-04-16 22:13:56.70693557 +0000 UTC m=+19.638260420" observedRunningTime="2026-04-16 22:13:57.828291911 +0000 UTC m=+20.759616773" watchObservedRunningTime="2026-04-16 22:13:57.841192894 +0000 UTC m=+20.772517755"
Apr 16 22:13:57.841426 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.841408 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hfs2p" podStartSLOduration=4.207181415 podStartE2EDuration="20.841404463s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.072714927 +0000 UTC m=+3.004039768" lastFinishedPulling="2026-04-16 22:13:56.706937971 +0000 UTC m=+19.638262816" observedRunningTime="2026-04-16 22:13:57.841089377 +0000 UTC m=+20.772414240" watchObservedRunningTime="2026-04-16 22:13:57.841404463 +0000 UTC m=+20.772729397"
Apr 16 22:13:57.912970 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:57.912928 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-r6vhd" podStartSLOduration=3.236959379 podStartE2EDuration="19.91291854s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.065343463 +0000 UTC m=+2.996668317" lastFinishedPulling="2026-04-16 22:13:56.741302635 +0000 UTC m=+19.672627478" observedRunningTime="2026-04-16 22:13:57.887764068 +0000 UTC m=+20.819088929" watchObservedRunningTime="2026-04-16 22:13:57.91291854 +0000 UTC m=+20.844243401"
Apr 16 22:13:58.601915 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.601797 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:57.802062384Z","UUID":"f6acef05-8cee-4c02-95f1-a82174bf32e4","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:13:58.603924 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.603898 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:13:58.603924 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.603929 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:13:58.674818 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.674789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:13:58.674938 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:58.674916 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:13:58.811204 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.811116 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6rfdq" event={"ID":"7547aebf-2698-4a15-952f-2dc060a10282","Type":"ContainerStarted","Data":"fd421b8bac80b1b5109bd081e202a9007da7792f945c0dac2899164b61713815"}
Apr 16 22:13:58.813263 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.813204 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" event={"ID":"1b0bb086-62f3-4a36-aba2-2986f55f2550","Type":"ContainerStarted","Data":"380f7797e0a5011064d8d16f372ea024c2321ff0f6775c32e1b5cd36db12810f"}
Apr 16 22:13:58.813263 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.813241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" event={"ID":"1b0bb086-62f3-4a36-aba2-2986f55f2550","Type":"ContainerStarted","Data":"2752bb415890a696d79d900dd7520b7c018c65d55b7e305d9676ac7f4c81d014"}
Apr 16 22:13:58.824956 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.824909 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6rfdq" podStartSLOduration=5.191932311 podStartE2EDuration="21.824897089s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.074168314 +0000 UTC m=+3.005493168" lastFinishedPulling="2026-04-16 22:13:56.707133092 +0000 UTC m=+19.638457946" observedRunningTime="2026-04-16 22:13:58.824285258 +0000 UTC m=+21.755610123" watchObservedRunningTime="2026-04-16 22:13:58.824897089 +0000 UTC m=+21.756221951"
Apr 16 22:13:58.838911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:58.838863 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbvm4" podStartSLOduration=3.367602448 podStartE2EDuration="21.83885098s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.068566666 +0000 UTC m=+2.999891519" lastFinishedPulling="2026-04-16 22:13:58.539815194 +0000 UTC m=+21.471140051" observedRunningTime="2026-04-16 22:13:58.838814787 +0000 UTC m=+21.770139650" watchObservedRunningTime="2026-04-16 22:13:58.83885098 +0000 UTC m=+21.770175846"
Apr 16 22:13:59.674977 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:59.674945 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:13:59.675146 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:59.675065 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:13:59.675213 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:59.675144 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:13:59.675274 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:13:59.675247 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:13:59.818640 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:59.818541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"86e9ddc941c4723184e891819fa91dd25e3a9cc91e3300980efbf9383b5a0917"}
Apr 16 22:13:59.819038 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:13:59.818639 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:14:00.674311 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:00.674111 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:00.674502 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:00.674415 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:01.674212 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.674185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:01.674980 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.674305 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:14:01.674980 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:01.674312 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:14:01.674980 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:01.674412 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:14:01.824871 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.824840 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" event={"ID":"401b401e-f58b-4d1a-ac91-0376c9ee48ff","Type":"ContainerStarted","Data":"fa8dda5650d0d4fdd873f242621fa9d8a1a9681f3cf9609d7c4f98dd93faf8b5"}
Apr 16 22:14:01.825236 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.825195 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs"
Apr 16 22:14:01.825236 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.825225 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs"
Apr 16 22:14:01.825377 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.825232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:01.825377 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:01.825370 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:14:01.825493 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:01.825443 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret podName:a88aba50-e875-413f-a0d8-0887fad52a8e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.825423482 +0000 UTC m=+40.756748329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret") pod "global-pull-secret-syncer-jng42" (UID: "a88aba50-e875-413f-a0d8-0887fad52a8e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:14:01.841326 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.841298 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs"
Apr 16 22:14:01.872742 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:01.872481 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" podStartSLOduration=7.9478125219999995 podStartE2EDuration="24.872464067s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.07682934 +0000 UTC m=+3.008154179" lastFinishedPulling="2026-04-16 22:13:57.00148087 +0000 UTC m=+19.932805724" observedRunningTime="2026-04-16 22:14:01.872331839 +0000 UTC m=+24.803656747" watchObservedRunningTime="2026-04-16 22:14:01.872464067 +0000 UTC m=+24.803788928"
Apr 16 22:14:02.674784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:02.674735 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:02.675195 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:02.674859 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:02.827563 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:02.827524 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="cb84e69cfa609cb5c138b2f6de350cb5e7d9aa78b1ccc79766bef03b18a031ea" exitCode=0
Apr 16 22:14:02.827734 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:02.827588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"cb84e69cfa609cb5c138b2f6de350cb5e7d9aa78b1ccc79766bef03b18a031ea"}
Apr 16 22:14:02.828849 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:02.828204 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs"
Apr 16 22:14:02.841702 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:02.841682 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs"
Apr 16 22:14:03.192804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.192728 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5bljr"
Apr 16 22:14:03.193009 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.192892 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:14:03.193307 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.193292 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5bljr"
Apr 16 22:14:03.630347 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.630073 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2f4gk"]
Apr 16 22:14:03.630347 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.630291 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:03.630582 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:03.630520 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:03.631486 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.631459 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jng42"]
Apr 16 22:14:03.631643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.631607 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:03.631730 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:03.631703 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:14:03.632022 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.632002 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kh55g"]
Apr 16 22:14:03.632109 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.632092 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:14:03.632225 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:03.632189 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:14:03.831846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:03.831814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerStarted","Data":"0f91bf33ec56ffc3a882bc767831fb10e435787ae0f8beb2c83ed8273f6a7bf7"}
Apr 16 22:14:04.674437 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:04.674405 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:04.674632 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:04.674511 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:04.835186 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:04.835153 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="0f91bf33ec56ffc3a882bc767831fb10e435787ae0f8beb2c83ed8273f6a7bf7" exitCode=0
Apr 16 22:14:04.835653 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:04.835253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"0f91bf33ec56ffc3a882bc767831fb10e435787ae0f8beb2c83ed8273f6a7bf7"}
Apr 16 22:14:05.674583 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:05.674561 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:14:05.674684 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:05.674565 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:05.674728 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:05.674675 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:14:05.674795 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:05.674774 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:14:05.839426 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:05.839345 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="df2d0f39d063eb941b15f93f9a98e44464e63d1d3e7592a25caa0e18953c6cb4" exitCode=0
Apr 16 22:14:05.839426 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:05.839411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"df2d0f39d063eb941b15f93f9a98e44464e63d1d3e7592a25caa0e18953c6cb4"}
Apr 16 22:14:06.674545 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:06.674515 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:06.674706 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:06.674664 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:07.675275 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:07.675235 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:14:07.675726 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:07.675335 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:14:07.675726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:07.675371 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:07.675726 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:07.675466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:14:08.674719 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:08.674637 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk"
Apr 16 22:14:08.674997 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:08.674763 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1"
Apr 16 22:14:09.674980 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.674939 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42"
Apr 16 22:14:09.674980 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.674962 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g"
Apr 16 22:14:09.675502 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:09.675078 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jng42" podUID="a88aba50-e875-413f-a0d8-0887fad52a8e"
Apr 16 22:14:09.675502 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:09.675195 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kh55g" podUID="c06150de-115f-4d2c-8b2c-ec356592e26f"
Apr 16 22:14:09.910537 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.910508 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeReady"
Apr 16 22:14:09.910725 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.910656 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:09.947813 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.947771 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"]
Apr 16 22:14:09.977104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.977025 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wcg8z"]
Apr 16 22:14:09.977265 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.977178 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:09.979211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.979190 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"]
Apr 16 22:14:09.979330 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.979219 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gv9kf"]
Apr 16 22:14:09.979386 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.979338 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:09.981008 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.980988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcg8z"]
Apr 16 22:14:09.981097 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.981087 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gv9kf"
Apr 16 22:14:09.981979 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.981794 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:14:09.982367 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982346 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:14:09.982484 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982460 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:09.982941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48hdh\""
Apr 16 22:14:09.982941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982765 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:14:09.982941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982792 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\""
Apr 16 22:14:09.982941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982856 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gv9kf"]
Apr 16 22:14:09.982941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.982869 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:09.984161 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.984022 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:09.984161 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.984056 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:09.985130 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.984464 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:09.985240 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.985224 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\""
Apr 16 22:14:09.988699 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:09.988652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:14:10.087193 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8vv\" (UniqueName: \"kubernetes.io/projected/011ad47b-a64a-4697-8f37-02cbc931d548-kube-api-access-vw8vv\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.087363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087279 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/011ad47b-a64a-4697-8f37-02cbc931d548-tmp-dir\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.087363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpsm\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011ad47b-a64a-4697-8f37-02cbc931d548-config-volume\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.087578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087403 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.087578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.087777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf"
Apr 16 22:14:10.087777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087702 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5k7n\" (UniqueName: \"kubernetes.io/projected/9b954b07-4736-4bf5-a073-457f98c06525-kube-api-access-t5k7n\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf"
Apr 16 22:14:10.087777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.087733 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.188970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/011ad47b-a64a-4697-8f37-02cbc931d548-tmp-dir\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.189048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpsm\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011ad47b-a64a-4697-8f37-02cbc931d548-config-volume\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189137 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189327 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf"
Apr 16 22:14:10.189734 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5k7n\" (UniqueName: \"kubernetes.io/projected/9b954b07-4736-4bf5-a073-457f98c06525-kube-api-access-t5k7n\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf"
Apr 16 22:14:10.189734 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.189354 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:10.189734 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.189734 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.189454 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:10.68942699 +0000 UTC m=+33.620751831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:10.189734 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.189501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8vv\" (UniqueName: \"kubernetes.io/projected/011ad47b-a64a-4697-8f37-02cbc931d548-kube-api-access-vw8vv\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.190671 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.190637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.190800 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.190715 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/011ad47b-a64a-4697-8f37-02cbc931d548-config-volume\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.190891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.190834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7"
Apr 16 22:14:10.190891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.190848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/011ad47b-a64a-4697-8f37-02cbc931d548-tmp-dir\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z"
Apr 16 22:14:10.191033 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.190965 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:10.191033 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.191025 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:10.691005958 +0000 UTC m=+33.622330806 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:10.191145 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.191051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.191245 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.191225 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:10.191299 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.191249 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:10.191349 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.191303 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:10.691286201 +0000 UTC m=+33.622611055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:10.195013 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.194990 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.196317 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.196292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.198622 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.198589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.199454 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.199430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8vv\" (UniqueName: \"kubernetes.io/projected/011ad47b-a64a-4697-8f37-02cbc931d548-kube-api-access-vw8vv\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:10.200358 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:14:10.200334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5k7n\" (UniqueName: \"kubernetes.io/projected/9b954b07-4736-4bf5-a073-457f98c06525-kube-api-access-t5k7n\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:10.200532 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.200511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpsm\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.674890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.674851 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:14:10.677840 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.677818 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:10.677840 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.677826 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\"" Apr 16 22:14:10.693240 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.693215 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:10.693348 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.693256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:10.693348 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:10.693322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:10.693442 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693361 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:10.693442 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693375 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:10.693442 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693407 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:10.693442 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693436 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:10.693608 
ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693447 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:11.693427089 +0000 UTC m=+34.624751946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:10.693608 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693484 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:11.693471842 +0000 UTC m=+34.624796687 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:10.693608 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:10.693502 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:11.693493019 +0000 UTC m=+34.624817861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:11.299981 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.299947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:14:11.300136 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.300095 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:11.300183 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.300158 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:43.300141814 +0000 UTC m=+66.231466659 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : secret "metrics-daemon-secret" not found Apr 16 22:14:11.400887 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.400844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:14:11.401046 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.401007 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:11.401046 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.401027 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:11.401046 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.401038 2571 projected.go:194] Error preparing data for projected volume kube-api-access-glg8w for pod openshift-network-diagnostics/network-check-target-kh55g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:11.401164 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.401086 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w podName:c06150de-115f-4d2c-8b2c-ec356592e26f nodeName:}" failed. No retries permitted until 2026-04-16 22:14:43.401072648 +0000 UTC m=+66.332397488 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-glg8w" (UniqueName: "kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w") pod "network-check-target-kh55g" (UID: "c06150de-115f-4d2c-8b2c-ec356592e26f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:11.674512 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.674488 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:14:11.674713 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.674696 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:14:11.678611 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.678590 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:11.678947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.678620 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:11.678947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.678636 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\"" Apr 16 22:14:11.678947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.678626 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:14:11.703068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.703045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:11.703175 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.703081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:11.703175 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.703136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:11.703296 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703210 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:11.703296 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703230 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:11.703296 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703235 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:11.703296 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703246 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:11.703296 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703289 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.703270428 +0000 UTC m=+36.634595276 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:11.703566 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703307 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.703298387 +0000 UTC m=+36.634623227 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:11.703566 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:11.703322 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.703313134 +0000 UTC m=+36.634637979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:11.854984 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:11.854951 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerStarted","Data":"d8c920f8cecb609ab66f031fb5afe5a1f904f1d3cca602fcd9009ef2328bdb4b"} Apr 16 22:14:12.859337 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:12.859307 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="d8c920f8cecb609ab66f031fb5afe5a1f904f1d3cca602fcd9009ef2328bdb4b" exitCode=0 Apr 16 22:14:12.859785 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:12.859356 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"d8c920f8cecb609ab66f031fb5afe5a1f904f1d3cca602fcd9009ef2328bdb4b"} Apr 16 22:14:13.720185 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:13.720144 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:13.720365 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:13.720206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:13.720365 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:13.720237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:13.720365 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720321 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:13.720365 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720332 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:13.720365 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720329 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:13.720579 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720382 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.720365955 +0000 UTC m=+40.651690799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:13.720579 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.720390585 +0000 UTC m=+40.651715426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:13.720579 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720427 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:13.720579 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:13.720481 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.72046798 +0000 UTC m=+40.651792834 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:13.863388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:13.863351 2571 generic.go:358] "Generic (PLEG): container finished" podID="efd08e10-5da2-4f18-afe6-c78ed9bde562" containerID="0ffcd84e5069a67808bd279d354f1553ac0ec2039eeccce9f2df649cbf485cbd" exitCode=0 Apr 16 22:14:13.863726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:13.863400 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerDied","Data":"0ffcd84e5069a67808bd279d354f1553ac0ec2039eeccce9f2df649cbf485cbd"} Apr 16 22:14:14.867638 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:14.867579 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" event={"ID":"efd08e10-5da2-4f18-afe6-c78ed9bde562","Type":"ContainerStarted","Data":"e4dd0079b91ae9c814a7f3eb410ab1a25d3e5a11b2727f044d967900bb6e34e6"} Apr 16 22:14:14.890259 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:14.890213 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fd5n6" podStartSLOduration=6.341143758 podStartE2EDuration="37.890199118s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:13:40.075011762 +0000 UTC m=+3.006336606" lastFinishedPulling="2026-04-16 22:14:11.624067103 +0000 UTC m=+34.555391966" observedRunningTime="2026-04-16 22:14:14.889003047 +0000 UTC m=+37.820327909" watchObservedRunningTime="2026-04-16 22:14:14.890199118 +0000 UTC m=+37.821523980" Apr 16 22:14:17.750324 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.750294 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.750353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.750423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750434 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750434 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750501 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:25.750484453 +0000 UTC m=+48.681809313 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750512 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750523 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750570 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:25.750546529 +0000 UTC m=+48.681871374 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:17.750701 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:17.750614 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:25.750595968 +0000 UTC m=+48.681920829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:17.850914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.850876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:14:17.854411 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.854386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88aba50-e875-413f-a0d8-0887fad52a8e-original-pull-secret\") pod \"global-pull-secret-syncer-jng42\" (UID: \"a88aba50-e875-413f-a0d8-0887fad52a8e\") " pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:14:17.993143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:17.993103 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jng42" Apr 16 22:14:18.187362 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:18.187333 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jng42"] Apr 16 22:14:18.190902 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:14:18.190875 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88aba50_e875_413f_a0d8_0887fad52a8e.slice/crio-2c30ef0197a30984dac30cb59e1f9598b2a87bb54ccd1343df2905884dee1568 WatchSource:0}: Error finding container 2c30ef0197a30984dac30cb59e1f9598b2a87bb54ccd1343df2905884dee1568: Status 404 returned error can't find the container with id 2c30ef0197a30984dac30cb59e1f9598b2a87bb54ccd1343df2905884dee1568 Apr 16 22:14:18.876940 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:18.876900 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jng42" event={"ID":"a88aba50-e875-413f-a0d8-0887fad52a8e","Type":"ContainerStarted","Data":"2c30ef0197a30984dac30cb59e1f9598b2a87bb54ccd1343df2905884dee1568"} Apr 16 22:14:22.885792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:22.885751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jng42" event={"ID":"a88aba50-e875-413f-a0d8-0887fad52a8e","Type":"ContainerStarted","Data":"4045e0c71f2b73725b8bf089af17de55ae30a5f9aa7295a91f279c1e48c26c2e"} Apr 16 22:14:22.900678 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:22.900617 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jng42" podStartSLOduration=34.005403507 podStartE2EDuration="37.900601231s" podCreationTimestamp="2026-04-16 22:13:45 +0000 UTC" firstStartedPulling="2026-04-16 22:14:18.192466037 +0000 UTC m=+41.123790877" lastFinishedPulling="2026-04-16 22:14:22.087663758 +0000 UTC m=+45.018988601" observedRunningTime="2026-04-16 22:14:22.900068859 +0000 UTC m=+45.831393721" watchObservedRunningTime="2026-04-16 22:14:22.900601231 +0000 UTC m=+45.831926094" Apr 16 22:14:25.800875 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:25.800839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:25.800898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:25.800953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.800971 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 
22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.800982 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.801034 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.801044 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.801037 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:41.801022708 +0000 UTC m=+64.732347551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.801100 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:41.801083915 +0000 UTC m=+64.732408761 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:25.801295 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:25.801120 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:14:41.801110654 +0000 UTC m=+64.732435494 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:34.845273 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:34.845243 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8trxs" Apr 16 22:14:41.816987 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:41.816947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:41.817005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:41.817043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817092 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817119 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817130 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817144 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817160 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:13.817143968 +0000 UTC m=+96.748468814 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817175 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. 
No retries permitted until 2026-04-16 22:15:13.81716976 +0000 UTC m=+96.748494599 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:14:41.817359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:41.817194 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:13.817181493 +0000 UTC m=+96.748506337 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:14:43.326503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.326465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:14:43.326896 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:43.326635 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:43.326896 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:14:43.326702 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:47.326683221 +0000 UTC m=+130.258008061 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : secret "metrics-daemon-secret" not found Apr 16 22:14:43.427191 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.427153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:14:43.429891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.429872 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:43.439859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.439837 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:43.450994 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.450973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glg8w\" (UniqueName: \"kubernetes.io/projected/c06150de-115f-4d2c-8b2c-ec356592e26f-kube-api-access-glg8w\") pod \"network-check-target-kh55g\" (UID: \"c06150de-115f-4d2c-8b2c-ec356592e26f\") " pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:14:43.489622 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.489599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\"" Apr 16 22:14:43.497470 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.497454 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:14:43.606500 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.606437 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kh55g"] Apr 16 22:14:43.609583 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:14:43.609543 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc06150de_115f_4d2c_8b2c_ec356592e26f.slice/crio-f8d29f496f603924e81d6e470e1e9f6fc43c6402966f40c5b58570205e208bb1 WatchSource:0}: Error finding container f8d29f496f603924e81d6e470e1e9f6fc43c6402966f40c5b58570205e208bb1: Status 404 returned error can't find the container with id f8d29f496f603924e81d6e470e1e9f6fc43c6402966f40c5b58570205e208bb1 Apr 16 22:14:43.923212 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:43.923133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kh55g" event={"ID":"c06150de-115f-4d2c-8b2c-ec356592e26f","Type":"ContainerStarted","Data":"f8d29f496f603924e81d6e470e1e9f6fc43c6402966f40c5b58570205e208bb1"} Apr 16 22:14:46.930223 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:46.930191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kh55g" event={"ID":"c06150de-115f-4d2c-8b2c-ec356592e26f","Type":"ContainerStarted","Data":"9ac6025ef8eb2e5237a06143f72afb0375f79444f87f16560a39fc9020bda154"} Apr 16 22:14:46.930691 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:14:46.930313 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:15:13.839310 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:13.839260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:15:13.839310 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:13.839321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:13.839353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839430 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839456 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839472 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839487 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76bb79884b-57jt7: secret "image-registry-tls" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839512 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert podName:9b954b07-4736-4bf5-a073-457f98c06525 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:17.839493059 +0000 UTC m=+160.770817899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert") pod "ingress-canary-gv9kf" (UID: "9b954b07-4736-4bf5-a073-457f98c06525") : secret "canary-serving-cert" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839531 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls podName:b090c0b3-373b-4083-99b5-0851f1e3c94b nodeName:}" failed. No retries permitted until 2026-04-16 22:16:17.839518749 +0000 UTC m=+160.770843596 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls") pod "image-registry-76bb79884b-57jt7" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b") : secret "image-registry-tls" not found Apr 16 22:15:13.839924 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:13.839545 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls podName:011ad47b-a64a-4697-8f37-02cbc931d548 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:17.839538698 +0000 UTC m=+160.770863542 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls") pod "dns-default-wcg8z" (UID: "011ad47b-a64a-4697-8f37-02cbc931d548") : secret "dns-default-metrics-tls" not found Apr 16 22:15:17.934147 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:17.934118 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kh55g" Apr 16 22:15:17.950426 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:17.950390 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kh55g" podStartSLOduration=98.227264236 podStartE2EDuration="1m40.950377727s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:14:43.611475822 +0000 UTC m=+66.542800669" lastFinishedPulling="2026-04-16 22:14:46.334589301 +0000 UTC m=+69.265914160" observedRunningTime="2026-04-16 22:14:46.945806171 +0000 UTC m=+69.877131033" watchObservedRunningTime="2026-04-16 22:15:17.950377727 +0000 UTC m=+100.881702583" Apr 16 22:15:47.369609 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:47.369537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:15:47.370191 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:47.369691 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:15:47.370191 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:47.369767 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs podName:e24f7f3c-00b2-43d5-9a49-1b7ee75125a1 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:49.369750953 +0000 UTC m=+252.301075793 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs") pod "network-metrics-daemon-2f4gk" (UID: "e24f7f3c-00b2-43d5-9a49-1b7ee75125a1") : secret "metrics-daemon-secret" not found Apr 16 22:15:54.672608 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.672567 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5"] Apr 16 22:15:54.675270 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.675254 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ktkhc"] Apr 16 22:15:54.675433 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.675413 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.677669 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.677646 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.678164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.678145 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.678585 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.678569 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 22:15:54.678665 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.678651 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6qvt5\"" Apr 16 22:15:54.678702 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.678654 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.680108 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.680079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 22:15:54.680245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.680228 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 22:15:54.680412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.680400 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4qfzc\"" Apr 16 22:15:54.680678 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.680654 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.681316 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.681283 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.684024 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.684004 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5"] Apr 16 22:15:54.685979 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.685959 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 22:15:54.686420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.686404 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ktkhc"] Apr 16 22:15:54.772215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.772185 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp"] Apr 16 22:15:54.774999 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.774985 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:54.778104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.778082 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 22:15:54.778231 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.778212 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.778341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.778294 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.778341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.778212 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 22:15:54.778494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.778480 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-h8gmc\"" Apr 16 22:15:54.780959 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.780939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t"] Apr 16 22:15:54.783727 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.783712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:54.785614 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.785591 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp"] Apr 16 22:15:54.786151 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.786135 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 22:15:54.787007 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.786988 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.787323 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.787306 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.787323 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.787316 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-74f6j\"" Apr 16 22:15:54.787450 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.787371 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 22:15:54.797290 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.797271 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t"] Apr 16 22:15:54.822115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bg2j\" (UniqueName: \"kubernetes.io/projected/29cdf665-3a36-4a09-a77b-299bff99a6ac-kube-api-access-8bg2j\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: 
\"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.822214 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.822214 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8scs\" (UniqueName: \"kubernetes.io/projected/c1491aea-f867-4bd4-ab58-776381aad953-kube-api-access-b8scs\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.822214 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-config\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.822321 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1491aea-f867-4bd4-ab58-776381aad953-serving-cert\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.822321 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.822247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-trusted-ca\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.873409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.873378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt"] Apr 16 22:15:54.877015 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.876990 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" Apr 16 22:15:54.879406 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.879389 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nrwq4\"" Apr 16 22:15:54.884258 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.884237 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt"] Apr 16 22:15:54.922540 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.922663 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8scs\" (UniqueName: \"kubernetes.io/projected/c1491aea-f867-4bd4-ab58-776381aad953-kube-api-access-b8scs\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.922713 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:54.922668 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:54.922765 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:54.922720 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls podName:29cdf665-3a36-4a09-a77b-299bff99a6ac nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.422702189 +0000 UTC m=+138.354027033 (durationBeforeRetry 500ms). 
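The volume manager's retry cadence is visible in these nestedpendingoperations entries: durationBeforeRetry doubles on each failed MountVolume.SetUp for the same volume (500ms here, then 1s, 2s, 4s, 8s, and 16s further down), and the earlier metrics-certs entry shows the ceiling, 2m2s. A minimal sketch of that schedule, with the initial delay and cap read off this log rather than taken from kubelet source:

    package main

    import (
        "fmt"
        "time"
    )

    // Doubling retry delay as seen in the nestedpendingoperations entries:
    // 500ms, 1s, 2s, 4s, 8s, ... capped at the 2m2s ceiling the earlier
    // metrics-certs entry shows. Constants are read off the log, not kubelet source.
    func retrySchedule(initial, maxDelay time.Duration, attempts int) []time.Duration {
        delays := make([]time.Duration, 0, attempts)
        d := initial
        for i := 0; i < attempts; i++ {
            delays = append(delays, d)
            if d *= 2; d > maxDelay {
                d = maxDelay
            }
        }
        return delays
    }

    func main() {
        // Prints: 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
        for _, d := range retrySchedule(500*time.Millisecond, 122*time.Second, 9) {
            fmt.Printf("%v ", d)
        }
        fmt.Println()
    }

Each volume carries its own backoff state, which is why samples-operator-tls and cluster-monitoring-operator-tls march through this schedule independently in the entries that follow.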
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6jtg5" (UID: "29cdf665-3a36-4a09-a77b-299bff99a6ac") : secret "samples-operator-tls" not found Apr 16 22:15:54.922765 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922748 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zfh\" (UniqueName: \"kubernetes.io/projected/8fbcc451-603b-46ab-be67-fab197b0645c-kube-api-access-q6zfh\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:54.922844 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-config\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.922844 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-config\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:54.922844 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:54.922951 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:54.922951 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1491aea-f867-4bd4-ab58-776381aad953-serving-cert\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.922951 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8fbcc451-603b-46ab-be67-fab197b0645c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:54.923057 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:15:54.922963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-trusted-ca\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.923057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.922995 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmrgj\" (UniqueName: \"kubernetes.io/projected/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-kube-api-access-qmrgj\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:54.923057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.923045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bg2j\" (UniqueName: \"kubernetes.io/projected/29cdf665-3a36-4a09-a77b-299bff99a6ac-kube-api-access-8bg2j\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.923872 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.923852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-config\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.924198 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.924177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1491aea-f867-4bd4-ab58-776381aad953-trusted-ca\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.925741 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.925723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1491aea-f867-4bd4-ab58-776381aad953-serving-cert\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.933565 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.933535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8scs\" (UniqueName: \"kubernetes.io/projected/c1491aea-f867-4bd4-ab58-776381aad953-kube-api-access-b8scs\") pod \"console-operator-9d4b6777b-ktkhc\" (UID: \"c1491aea-f867-4bd4-ab58-776381aad953\") " pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:54.933639 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:54.933588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bg2j\" (UniqueName: \"kubernetes.io/projected/29cdf665-3a36-4a09-a77b-299bff99a6ac-kube-api-access-8bg2j\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:54.991514 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:15:54.991482 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:15:55.023699 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.023664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8fbcc451-603b-46ab-be67-fab197b0645c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.023821 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.023712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmrgj\" (UniqueName: \"kubernetes.io/projected/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-kube-api-access-qmrgj\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.023884 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.023866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-config\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.023944 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.023910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zfh\" (UniqueName: \"kubernetes.io/projected/8fbcc451-603b-46ab-be67-fab197b0645c-kube-api-access-q6zfh\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.023996 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.023947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6fcl\" (UniqueName: \"kubernetes.io/projected/a3be75ea-4ff5-4062-8e69-d6fdf589b369-kube-api-access-j6fcl\") pod \"network-check-source-8894fc9bd-7hlrt\" (UID: \"a3be75ea-4ff5-4062-8e69-d6fdf589b369\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" Apr 16 22:15:55.024163 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.024139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.024280 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.024265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.024404 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.024384 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 16 22:15:55.024464 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.024449 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.524431156 +0000 UTC m=+138.455756006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kr46t" (UID: "8fbcc451-603b-46ab-be67-fab197b0645c") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:55.024526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.024461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8fbcc451-603b-46ab-be67-fab197b0645c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.024526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.024511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-config\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.026868 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.026845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.034303 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.034257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zfh\" (UniqueName: \"kubernetes.io/projected/8fbcc451-603b-46ab-be67-fab197b0645c-kube-api-access-q6zfh\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.035052 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.035032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmrgj\" (UniqueName: \"kubernetes.io/projected/c7e9979f-4063-42fa-aa5b-a0d2d60f93a5-kube-api-access-qmrgj\") pod \"service-ca-operator-d6fc45fc5-lntwp\" (UID: \"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.083849 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.083823 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" Apr 16 22:15:55.103757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.103724 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ktkhc"] Apr 16 22:15:55.106821 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:15:55.106788 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1491aea_f867_4bd4_ab58_776381aad953.slice/crio-eaef7f2e51b6011898b67e472fedb07e8928555c3e0503ee0137428564f1c738 WatchSource:0}: Error finding container eaef7f2e51b6011898b67e472fedb07e8928555c3e0503ee0137428564f1c738: Status 404 returned error can't find the container with id eaef7f2e51b6011898b67e472fedb07e8928555c3e0503ee0137428564f1c738 Apr 16 22:15:55.125385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.125361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6fcl\" (UniqueName: \"kubernetes.io/projected/a3be75ea-4ff5-4062-8e69-d6fdf589b369-kube-api-access-j6fcl\") pod \"network-check-source-8894fc9bd-7hlrt\" (UID: \"a3be75ea-4ff5-4062-8e69-d6fdf589b369\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" Apr 16 22:15:55.134938 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.134910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6fcl\" (UniqueName: \"kubernetes.io/projected/a3be75ea-4ff5-4062-8e69-d6fdf589b369-kube-api-access-j6fcl\") pod \"network-check-source-8894fc9bd-7hlrt\" (UID: \"a3be75ea-4ff5-4062-8e69-d6fdf589b369\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" Apr 16 22:15:55.185872 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.185807 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" Apr 16 22:15:55.194091 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.194066 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp"] Apr 16 22:15:55.196855 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:15:55.196832 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e9979f_4063_42fa_aa5b_a0d2d60f93a5.slice/crio-09633cd12bc66ce96f21d2e28b847353e47d980aae9b78063a8228ce98cd161d WatchSource:0}: Error finding container 09633cd12bc66ce96f21d2e28b847353e47d980aae9b78063a8228ce98cd161d: Status 404 returned error can't find the container with id 09633cd12bc66ce96f21d2e28b847353e47d980aae9b78063a8228ce98cd161d Apr 16 22:15:55.296305 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.296275 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt"] Apr 16 22:15:55.298852 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:15:55.298829 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3be75ea_4ff5_4062_8e69_d6fdf589b369.slice/crio-cbe6254ce6cd8cd82e2f49904b2bac815444529ba856e461f28b68b7e1a7d211 WatchSource:0}: Error finding container cbe6254ce6cd8cd82e2f49904b2bac815444529ba856e461f28b68b7e1a7d211: Status 404 returned error can't find the container with id cbe6254ce6cd8cd82e2f49904b2bac815444529ba856e461f28b68b7e1a7d211 Apr 16 22:15:55.427834 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.427800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:55.427997 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.427977 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:55.428112 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.428055 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls podName:29cdf665-3a36-4a09-a77b-299bff99a6ac nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.428033147 +0000 UTC m=+139.359357994 (durationBeforeRetry 1s). 
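A "secret ... not found" here is an API-server fact, not a mount bug: the kubelet asked for the Secret object and it does not exist yet. A quick existence check with standard client-go calls, sketched under the assumption of an admin kubeconfig (the path and error handling are illustrative; the namespace and secret name are the ones failing in the entries above):

    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Kubeconfig path is an assumption for this sketch.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // Namespace and secret name come from the failing mounts above.
        _, err = client.CoreV1().Secrets("openshift-cluster-samples-operator").
            Get(context.TODO(), "samples-operator-tls", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            fmt.Println("secret not created yet; the same condition the kubelet is retrying on")
        case err != nil:
            panic(err)
        default:
            fmt.Println("secret exists; the next mount retry should succeed")
        }
    }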
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6jtg5" (UID: "29cdf665-3a36-4a09-a77b-299bff99a6ac") : secret "samples-operator-tls" not found Apr 16 22:15:55.528765 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:55.528731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:55.528919 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.528883 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:55.528962 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:55.528944 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.528928876 +0000 UTC m=+139.460253721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kr46t" (UID: "8fbcc451-603b-46ab-be67-fab197b0645c") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:56.058150 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.058110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" event={"ID":"a3be75ea-4ff5-4062-8e69-d6fdf589b369","Type":"ContainerStarted","Data":"d0cc0dab082f33dcdd4064dea99879f64cc401f572c22d3cd93776dbad69f419"} Apr 16 22:15:56.058630 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.058158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" event={"ID":"a3be75ea-4ff5-4062-8e69-d6fdf589b369","Type":"ContainerStarted","Data":"cbe6254ce6cd8cd82e2f49904b2bac815444529ba856e461f28b68b7e1a7d211"} Apr 16 22:15:56.060207 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.060178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" event={"ID":"c1491aea-f867-4bd4-ab58-776381aad953","Type":"ContainerStarted","Data":"eaef7f2e51b6011898b67e472fedb07e8928555c3e0503ee0137428564f1c738"} Apr 16 22:15:56.061856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.061830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" event={"ID":"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5","Type":"ContainerStarted","Data":"09633cd12bc66ce96f21d2e28b847353e47d980aae9b78063a8228ce98cd161d"} Apr 16 22:15:56.074230 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.074188 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7hlrt" podStartSLOduration=2.074174217 podStartE2EDuration="2.074174217s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:56.073417982 +0000 UTC m=+139.004742845" watchObservedRunningTime="2026-04-16 22:15:56.074174217 +0000 UTC m=+139.005499079" Apr 16 22:15:56.438218 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.438137 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:56.438378 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:56.438311 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:56.438424 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:56.438387 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls podName:29cdf665-3a36-4a09-a77b-299bff99a6ac nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.438368085 +0000 UTC m=+141.369692944 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6jtg5" (UID: "29cdf665-3a36-4a09-a77b-299bff99a6ac") : secret "samples-operator-tls" not found Apr 16 22:15:56.539435 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:56.539391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:56.539617 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:56.539540 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:56.539693 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:56.539629 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.539608607 +0000 UTC m=+141.470933454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kr46t" (UID: "8fbcc451-603b-46ab-be67-fab197b0645c") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:58.067435 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.067407 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/0.log" Apr 16 22:15:58.067890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.067446 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1491aea-f867-4bd4-ab58-776381aad953" containerID="e5e7ea7fe1cde6dde78fc1dd863386129cd2a1444096f72cd8fed516afa16ebd" exitCode=255 Apr 16 22:15:58.067890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.067480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" event={"ID":"c1491aea-f867-4bd4-ab58-776381aad953","Type":"ContainerDied","Data":"e5e7ea7fe1cde6dde78fc1dd863386129cd2a1444096f72cd8fed516afa16ebd"} Apr 16 22:15:58.067890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.067819 2571 scope.go:117] "RemoveContainer" containerID="e5e7ea7fe1cde6dde78fc1dd863386129cd2a1444096f72cd8fed516afa16ebd" Apr 16 22:15:58.068897 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.068867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" event={"ID":"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5","Type":"ContainerStarted","Data":"8d049bcd1a9293f5cecdb850eeece2c08af48d454f66871e93b80f7250fb4413"} Apr 16 22:15:58.097087 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.097046 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" podStartSLOduration=1.884469203 podStartE2EDuration="4.09703273s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.198647624 +0000 UTC m=+138.129972478" lastFinishedPulling="2026-04-16 22:15:57.411211165 +0000 UTC m=+140.342536005" observedRunningTime="2026-04-16 22:15:58.096697896 +0000 UTC m=+141.028022757" watchObservedRunningTime="2026-04-16 22:15:58.09703273 +0000 UTC m=+141.028357591" Apr 16 22:15:58.455025 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.454932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:15:58.455158 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:58.455069 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:58.455158 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:58.455143 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls podName:29cdf665-3a36-4a09-a77b-299bff99a6ac nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.45512782 +0000 UTC m=+145.386452663 (durationBeforeRetry 4s). 
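The console-operator entries above, together with the pod_workers lines a few entries below, are a textbook crash loop: the container exits with code 255, PLEG reports ContainerDied, the old container is removed, and restarts are throttled with CrashLoopBackOff starting at the 10s the message shows (the kubelet doubles this on repeated crashes up to a version-dependent cap, so treat the ceiling as an assumption). When triaging a journal like this one, those back-off messages can be pulled apart mechanically; the pattern below mirrors the exact message text in this log and is not guaranteed to match other kubelet versions:

    package main

    import (
        "fmt"
        "regexp"
    )

    // Extract the delay, container, pod, namespace, and UID from the
    // CrashLoopBackOff messages that appear a few entries below.
    var backoffRe = regexp.MustCompile(
        `back-off (\S+) restarting failed container=(\S+) pod=(\S+)_(\S+)\(([0-9a-f-]+)\)`)

    func main() {
        line := `back-off 10s restarting failed container=console-operator ` +
            `pod=console-operator-9d4b6777b-ktkhc_openshift-console-operator(c1491aea-f867-4bd4-ab58-776381aad953)`
        if m := backoffRe.FindStringSubmatch(line); m != nil {
            fmt.Printf("delay=%s container=%s pod=%s ns=%s uid=%s\n", m[1], m[2], m[3], m[4], m[5])
        }
    }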
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6jtg5" (UID: "29cdf665-3a36-4a09-a77b-299bff99a6ac") : secret "samples-operator-tls" not found Apr 16 22:15:58.556103 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.556069 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:15:58.556284 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:58.556196 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:58.556284 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:58.556255 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.556240655 +0000 UTC m=+145.487565497 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kr46t" (UID: "8fbcc451-603b-46ab-be67-fab197b0645c") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:58.848612 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.848579 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw"] Apr 16 22:15:58.851480 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.851464 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" Apr 16 22:15:58.857590 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.857565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjct\" (UniqueName: \"kubernetes.io/projected/adbbb6b8-3903-47df-997c-6351eef0c24d-kube-api-access-cwjct\") pod \"migrator-74bb7799d9-7g4vw\" (UID: \"adbbb6b8-3903-47df-997c-6351eef0c24d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" Apr 16 22:15:58.857721 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.857688 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-68l4l\"" Apr 16 22:15:58.857827 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.857750 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 22:15:58.857827 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.857784 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:58.864456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.864438 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw"] Apr 16 22:15:58.958535 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.958503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjct\" (UniqueName: \"kubernetes.io/projected/adbbb6b8-3903-47df-997c-6351eef0c24d-kube-api-access-cwjct\") pod \"migrator-74bb7799d9-7g4vw\" (UID: \"adbbb6b8-3903-47df-997c-6351eef0c24d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" Apr 16 22:15:58.974996 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:58.974974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjct\" (UniqueName: \"kubernetes.io/projected/adbbb6b8-3903-47df-997c-6351eef0c24d-kube-api-access-cwjct\") pod \"migrator-74bb7799d9-7g4vw\" (UID: \"adbbb6b8-3903-47df-997c-6351eef0c24d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" Apr 16 22:15:59.072376 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.072355 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:15:59.072757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.072743 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/0.log" Apr 16 22:15:59.072850 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.072785 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1491aea-f867-4bd4-ab58-776381aad953" containerID="54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb" exitCode=255 Apr 16 22:15:59.072906 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.072878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" event={"ID":"c1491aea-f867-4bd4-ab58-776381aad953","Type":"ContainerDied","Data":"54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb"} Apr 16 
22:15:59.072953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.072916 2571 scope.go:117] "RemoveContainer" containerID="e5e7ea7fe1cde6dde78fc1dd863386129cd2a1444096f72cd8fed516afa16ebd" Apr 16 22:15:59.073143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.073121 2571 scope.go:117] "RemoveContainer" containerID="54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb" Apr 16 22:15:59.073382 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:15:59.073363 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ktkhc_openshift-console-operator(c1491aea-f867-4bd4-ab58-776381aad953)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" podUID="c1491aea-f867-4bd4-ab58-776381aad953" Apr 16 22:15:59.160264 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.160200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" Apr 16 22:15:59.273723 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:15:59.273594 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw"] Apr 16 22:15:59.276016 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:15:59.275989 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbbb6b8_3903_47df_997c_6351eef0c24d.slice/crio-a2cf4efab7d21343b25e8dcfacd16ca3674ed881d82ef4f18c5fcea336736bff WatchSource:0}: Error finding container a2cf4efab7d21343b25e8dcfacd16ca3674ed881d82ef4f18c5fcea336736bff: Status 404 returned error can't find the container with id a2cf4efab7d21343b25e8dcfacd16ca3674ed881d82ef4f18c5fcea336736bff Apr 16 22:16:00.076895 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:00.076866 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:16:00.077352 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:00.077254 2571 scope.go:117] "RemoveContainer" containerID="54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb" Apr 16 22:16:00.077484 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:00.077461 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ktkhc_openshift-console-operator(c1491aea-f867-4bd4-ab58-776381aad953)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" podUID="c1491aea-f867-4bd4-ab58-776381aad953" Apr 16 22:16:00.078093 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:00.078069 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" event={"ID":"adbbb6b8-3903-47df-997c-6351eef0c24d","Type":"ContainerStarted","Data":"a2cf4efab7d21343b25e8dcfacd16ca3674ed881d82ef4f18c5fcea336736bff"} Apr 16 22:16:00.345656 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:00.345569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bz2qc_7eb428a4-211d-4444-991e-d8b3dac28ddf/dns-node-resolver/0.log" Apr 16 22:16:01.082596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.082544 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" event={"ID":"adbbb6b8-3903-47df-997c-6351eef0c24d","Type":"ContainerStarted","Data":"95b753bfe595b811f3f848842362760f31a0d70d22315724b64e4af47bb2e874"} Apr 16 22:16:01.082596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.082600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" event={"ID":"adbbb6b8-3903-47df-997c-6351eef0c24d","Type":"ContainerStarted","Data":"1f12ae7783f2c99035d620af0ea6ba4f81bd012d1572a218e22ca04d2b823489"} Apr 16 22:16:01.101255 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.101205 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7g4vw" podStartSLOduration=1.9765944659999999 podStartE2EDuration="3.101191761s" podCreationTimestamp="2026-04-16 22:15:58 +0000 UTC" firstStartedPulling="2026-04-16 22:15:59.277806789 +0000 UTC m=+142.209131630" lastFinishedPulling="2026-04-16 22:16:00.402404072 +0000 UTC m=+143.333728925" observedRunningTime="2026-04-16 22:16:01.099892798 +0000 UTC m=+144.031217658" watchObservedRunningTime="2026-04-16 22:16:01.101191761 +0000 UTC m=+144.032516653" Apr 16 22:16:01.150438 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.150411 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4mcb4_5c62daff-6789-4383-b4d0-6b51a07c06bb/node-ca/0.log" Apr 16 22:16:01.180919 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.180882 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4n554"] Apr 16 22:16:01.183960 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.183938 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.186243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.186196 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 22:16:01.186243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.186223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 22:16:01.186399 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.186279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 22:16:01.186399 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.186329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-l676z\"" Apr 16 22:16:01.186497 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.186484 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 22:16:01.191644 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.191622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4n554"] Apr 16 22:16:01.278644 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.278611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-cabundle\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.278644 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.278648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-key\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.278859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.278690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnmf4\" (UniqueName: \"kubernetes.io/projected/6ba9efdf-adc7-4b01-ab1c-558296f445ab-kube-api-access-lnmf4\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.379620 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.379491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnmf4\" (UniqueName: \"kubernetes.io/projected/6ba9efdf-adc7-4b01-ab1c-558296f445ab-kube-api-access-lnmf4\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.379781 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.379633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-cabundle\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.379781 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:16:01.379660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-key\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.380319 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.380292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-cabundle\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.381965 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.381947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6ba9efdf-adc7-4b01-ab1c-558296f445ab-signing-key\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.386953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.386934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnmf4\" (UniqueName: \"kubernetes.io/projected/6ba9efdf-adc7-4b01-ab1c-558296f445ab-kube-api-access-lnmf4\") pod \"service-ca-865cb79987-4n554\" (UID: \"6ba9efdf-adc7-4b01-ab1c-558296f445ab\") " pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.492980 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.492938 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4n554" Apr 16 22:16:01.605896 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:01.605847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4n554"] Apr 16 22:16:01.609361 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:01.609328 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba9efdf_adc7_4b01_ab1c_558296f445ab.slice/crio-aef348a376f97d1762f49dbd2a19ec0060bbc3edaf21585f7a3d473e24dd0a1e WatchSource:0}: Error finding container aef348a376f97d1762f49dbd2a19ec0060bbc3edaf21585f7a3d473e24dd0a1e: Status 404 returned error can't find the container with id aef348a376f97d1762f49dbd2a19ec0060bbc3edaf21585f7a3d473e24dd0a1e Apr 16 22:16:02.087161 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:02.087128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4n554" event={"ID":"6ba9efdf-adc7-4b01-ab1c-558296f445ab","Type":"ContainerStarted","Data":"6eed11ead8af5de510392345faa70b60bff0aa46005aeaa96de0e0e194d7d6c3"} Apr 16 22:16:02.087591 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:02.087170 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4n554" event={"ID":"6ba9efdf-adc7-4b01-ab1c-558296f445ab","Type":"ContainerStarted","Data":"aef348a376f97d1762f49dbd2a19ec0060bbc3edaf21585f7a3d473e24dd0a1e"} Apr 16 22:16:02.104449 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:02.104400 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-4n554" podStartSLOduration=1.104384521 podStartE2EDuration="1.104384521s" 
podCreationTimestamp="2026-04-16 22:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:02.103431925 +0000 UTC m=+145.034756791" watchObservedRunningTime="2026-04-16 22:16:02.104384521 +0000 UTC m=+145.035709383" Apr 16 22:16:02.489125 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:02.489033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6jtg5\" (UID: \"29cdf665-3a36-4a09-a77b-299bff99a6ac\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:16:02.489322 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:02.489176 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:16:02.489322 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:02.489244 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls podName:29cdf665-3a36-4a09-a77b-299bff99a6ac nodeName:}" failed. No retries permitted until 2026-04-16 22:16:10.489229318 +0000 UTC m=+153.420554160 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/29cdf665-3a36-4a09-a77b-299bff99a6ac-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6jtg5" (UID: "29cdf665-3a36-4a09-a77b-299bff99a6ac") : secret "samples-operator-tls" not found Apr 16 22:16:02.590427 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:02.590389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:16:02.590636 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:02.590498 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:02.590636 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:02.590574 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:16:10.59053957 +0000 UTC m=+153.521864410 (durationBeforeRetry 8s). 
Apr 16 22:16:04.993076 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:04.993015 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc"
Apr 16 22:16:04.993414 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:04.993087 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc"
Apr 16 22:16:04.993583 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:04.993565 2571 scope.go:117] "RemoveContainer" containerID="54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb"
Apr 16 22:16:04.993794 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:04.993775 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ktkhc_openshift-console-operator(c1491aea-f867-4bd4-ab58-776381aad953)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" podUID="c1491aea-f867-4bd4-ab58-776381aad953"
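Note: the CrashLoopBackOff entry above is the restart-side counterpart of that backoff: the kubelet delays each restart of a repeatedly crashing container, beginning at the logged "back-off 10s" and, in stock kubelets, doubling per crash up to a 5-minute cap, with a reset once the container runs cleanly for a while; treat those constants as assumptions, since this log shows only the first step. Here the 10s window lapses and the retry succeeds: "RemoveContainer" fires again at 22:16:20.675, the replacement container starts at 22:16:21.143, and readiness flips to ready at 22:16:21.148. A sketch of the assumed schedule:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed: 10s base, doubling per crash, 5m cap (only "back-off 10s"
        // appears in this log).
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for crash := 1; crash <= 7; crash++ {
            fmt.Printf("crash %d -> back-off %v\n", crash, delay)
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
    }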
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" Apr 16 22:16:10.660104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:10.660071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:16:10.660343 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:10.660231 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:10.660343 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:10.660313 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls podName:8fbcc451-603b-46ab-be67-fab197b0645c nodeName:}" failed. No retries permitted until 2026-04-16 22:16:26.660291908 +0000 UTC m=+169.591616752 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kr46t" (UID: "8fbcc451-603b-46ab-be67-fab197b0645c") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:10.703495 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:10.703470 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5"] Apr 16 22:16:11.110973 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:11.110924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" event={"ID":"29cdf665-3a36-4a09-a77b-299bff99a6ac","Type":"ContainerStarted","Data":"6c4e70d13294f0302e9d5a7587e91129a26819cf16abb25e61e82aba89431db9"} Apr 16 22:16:12.990980 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:12.990940 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" Apr 16 22:16:12.998074 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:12.998043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wcg8z" podUID="011ad47b-a64a-4697-8f37-02cbc931d548" Apr 16 22:16:13.004254 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:13.004226 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gv9kf" podUID="9b954b07-4736-4bf5-a073-457f98c06525" Apr 16 22:16:13.118368 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.118318 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:13.118368 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.118341 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" event={"ID":"29cdf665-3a36-4a09-a77b-299bff99a6ac","Type":"ContainerStarted","Data":"7933148a50d46e1375444367f401b3c108202a47e812ef629172646f87edcecf"} Apr 16 22:16:13.118578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.118382 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" event={"ID":"29cdf665-3a36-4a09-a77b-299bff99a6ac","Type":"ContainerStarted","Data":"f2279380bf3af7eeed608ccb203d79a119319aa60c8b7ff72e81d8d0a10d3acb"} Apr 16 22:16:13.118578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.118384 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:16:13.118578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.118391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:13.135201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:13.135147 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6jtg5" podStartSLOduration=17.269951271 podStartE2EDuration="19.135130912s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.745498018 +0000 UTC m=+153.676822861" lastFinishedPulling="2026-04-16 22:16:12.610677647 +0000 UTC m=+155.542002502" observedRunningTime="2026-04-16 22:16:13.13429928 +0000 UTC m=+156.065624143" watchObservedRunningTime="2026-04-16 22:16:13.135130912 +0000 UTC m=+156.066455775" Apr 16 22:16:13.684469 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:13.684435 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2f4gk" podUID="e24f7f3c-00b2-43d5-9a49-1b7ee75125a1" Apr 16 22:16:17.920640 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.920608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:17.921128 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.920687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:17.921128 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.920723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:16:17.922925 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.922902 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/011ad47b-a64a-4697-8f37-02cbc931d548-metrics-tls\") pod \"dns-default-wcg8z\" (UID: \"011ad47b-a64a-4697-8f37-02cbc931d548\") " pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:17.923015 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.922952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"image-registry-76bb79884b-57jt7\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:17.923093 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:17.923074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b954b07-4736-4bf5-a073-457f98c06525-cert\") pod \"ingress-canary-gv9kf\" (UID: \"9b954b07-4736-4bf5-a073-457f98c06525\") " pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:16:18.222953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.222872 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\"" Apr 16 22:16:18.222953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.222911 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\"" Apr 16 22:16:18.223129 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.222868 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48hdh\"" Apr 16 22:16:18.229791 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.229767 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:18.229893 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.229798 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:18.229893 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.229771 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gv9kf" Apr 16 22:16:18.385696 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.385665 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gv9kf"] Apr 16 22:16:18.388502 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:18.388468 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b954b07_4736_4bf5_a073_457f98c06525.slice/crio-6e625339e67ff92c8a77c7d3e7820cc55c1770380d673787e7ba6c8b0fefdaf7 WatchSource:0}: Error finding container 6e625339e67ff92c8a77c7d3e7820cc55c1770380d673787e7ba6c8b0fefdaf7: Status 404 returned error can't find the container with id 6e625339e67ff92c8a77c7d3e7820cc55c1770380d673787e7ba6c8b0fefdaf7 Apr 16 22:16:18.609230 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.609196 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcg8z"] Apr 16 22:16:18.612486 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:18.612453 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod011ad47b_a64a_4697_8f37_02cbc931d548.slice/crio-1bdedf9b33c1d794924c49a989eaeebdd59b498adfaf912d2c1dd0b1e7fe81d9 WatchSource:0}: Error finding container 1bdedf9b33c1d794924c49a989eaeebdd59b498adfaf912d2c1dd0b1e7fe81d9: Status 404 returned error can't find the container with id 1bdedf9b33c1d794924c49a989eaeebdd59b498adfaf912d2c1dd0b1e7fe81d9 Apr 16 22:16:18.612486 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:18.612472 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"] Apr 16 22:16:18.619715 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:18.619687 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb090c0b3_373b_4083_99b5_0851f1e3c94b.slice/crio-9ee558f23ff2236190c64e52d543f6996e84f7a856f9a1bd0e2f4ab804a6ecae WatchSource:0}: Error finding container 9ee558f23ff2236190c64e52d543f6996e84f7a856f9a1bd0e2f4ab804a6ecae: Status 404 returned error can't find the container with id 9ee558f23ff2236190c64e52d543f6996e84f7a856f9a1bd0e2f4ab804a6ecae Apr 16 22:16:19.134759 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:19.134720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gv9kf" event={"ID":"9b954b07-4736-4bf5-a073-457f98c06525","Type":"ContainerStarted","Data":"6e625339e67ff92c8a77c7d3e7820cc55c1770380d673787e7ba6c8b0fefdaf7"} Apr 16 22:16:19.135904 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:19.135877 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcg8z" event={"ID":"011ad47b-a64a-4697-8f37-02cbc931d548","Type":"ContainerStarted","Data":"1bdedf9b33c1d794924c49a989eaeebdd59b498adfaf912d2c1dd0b1e7fe81d9"} Apr 16 22:16:19.137295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:19.137270 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" event={"ID":"b090c0b3-373b-4083-99b5-0851f1e3c94b","Type":"ContainerStarted","Data":"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a"} Apr 16 22:16:19.137393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:19.137300 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" 
event={"ID":"b090c0b3-373b-4083-99b5-0851f1e3c94b","Type":"ContainerStarted","Data":"9ee558f23ff2236190c64e52d543f6996e84f7a856f9a1bd0e2f4ab804a6ecae"} Apr 16 22:16:19.137393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:19.137333 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:20.675283 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:20.675200 2571 scope.go:117] "RemoveContainer" containerID="54ea1a8e540b08455924788e5b0a90d974f084e7f689310d615302093f6910eb" Apr 16 22:16:21.143723 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.143695 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:16:21.143895 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.143814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" event={"ID":"c1491aea-f867-4bd4-ab58-776381aad953","Type":"ContainerStarted","Data":"20f337d4705e85c9beaa220b2031477e74ae7e7b3cb91c0366bb4aab8a7e38ea"} Apr 16 22:16:21.144128 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.144107 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:16:21.145040 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.145017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gv9kf" event={"ID":"9b954b07-4736-4bf5-a073-457f98c06525","Type":"ContainerStarted","Data":"7dd6bfddf4e4c02f4639627f6d4813bb7f38e61139f45e7e29e8d4d0b1840eeb"} Apr 16 22:16:21.146578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.146536 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcg8z" event={"ID":"011ad47b-a64a-4697-8f37-02cbc931d548","Type":"ContainerStarted","Data":"83461da7631434e548f7a819f7ce395abb995eb14270cb8d7c48d1649c26ce2f"} Apr 16 22:16:21.146660 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.146581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcg8z" event={"ID":"011ad47b-a64a-4697-8f37-02cbc931d548","Type":"ContainerStarted","Data":"d98c5b5096e049d79fddce6a9c69a66739328d4b67068a18283e2f316457cba8"} Apr 16 22:16:21.146714 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.146689 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:21.148992 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.148976 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" Apr 16 22:16:21.161863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.161827 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" podStartSLOduration=163.161816758 podStartE2EDuration="2m43.161816758s" podCreationTimestamp="2026-04-16 22:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:19.15941716 +0000 UTC m=+162.090742021" watchObservedRunningTime="2026-04-16 22:16:21.161816758 +0000 UTC m=+164.093141620" Apr 16 22:16:21.162234 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.162212 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ktkhc" podStartSLOduration=24.862274443 podStartE2EDuration="27.162206654s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.108574628 +0000 UTC m=+138.039899471" lastFinishedPulling="2026-04-16 22:15:57.40850684 +0000 UTC m=+140.339831682" observedRunningTime="2026-04-16 22:16:21.16081687 +0000 UTC m=+164.092141729" watchObservedRunningTime="2026-04-16 22:16:21.162206654 +0000 UTC m=+164.093531515" Apr 16 22:16:21.176752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.176697 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wcg8z" podStartSLOduration=130.460807196 podStartE2EDuration="2m12.176687432s" podCreationTimestamp="2026-04-16 22:14:09 +0000 UTC" firstStartedPulling="2026-04-16 22:16:18.614600938 +0000 UTC m=+161.545925791" lastFinishedPulling="2026-04-16 22:16:20.330481145 +0000 UTC m=+163.261806027" observedRunningTime="2026-04-16 22:16:21.175930155 +0000 UTC m=+164.107255017" watchObservedRunningTime="2026-04-16 22:16:21.176687432 +0000 UTC m=+164.108012293" Apr 16 22:16:21.190301 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:21.190265 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gv9kf" podStartSLOduration=130.252656952 podStartE2EDuration="2m12.19025595s" podCreationTimestamp="2026-04-16 22:14:09 +0000 UTC" firstStartedPulling="2026-04-16 22:16:18.391156878 +0000 UTC m=+161.322481718" lastFinishedPulling="2026-04-16 22:16:20.328755864 +0000 UTC m=+163.260080716" observedRunningTime="2026-04-16 22:16:21.18933723 +0000 UTC m=+164.120662092" watchObservedRunningTime="2026-04-16 22:16:21.19025595 +0000 UTC m=+164.121580845" Apr 16 22:16:22.105567 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.105519 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lmj7s"] Apr 16 22:16:22.108939 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.108919 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.111498 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.111476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k7nbm\"" Apr 16 22:16:22.112430 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.112409 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 22:16:22.112578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.112452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 22:16:22.112578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.112456 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:16:22.112578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.112502 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:16:22.118996 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.118976 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lmj7s"] Apr 16 22:16:22.148781 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.148729 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-spq2q"] Apr 16 22:16:22.151935 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.151903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:22.154167 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.154148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:16:22.154496 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.154478 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:16:22.154606 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.154583 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9m5zr\"" Apr 16 22:16:22.163978 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.163959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-spq2q"] Apr 16 22:16:22.256085 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.256056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqqhm\" (UniqueName: \"kubernetes.io/projected/240bb924-5548-44ac-aab5-ffc41dca2bf6-kube-api-access-cqqhm\") pod \"downloads-6bcc868b7-spq2q\" (UID: \"240bb924-5548-44ac-aab5-ffc41dca2bf6\") " pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:22.256282 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.256170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f55d1b2b-7070-4ea9-a110-9789e41cd235-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.256351 ip-10-0-129-102 kubenswrapper[2571]: 
I0416 22:16:22.256274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f55d1b2b-7070-4ea9-a110-9789e41cd235-data-volume\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.256351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.256314 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f55d1b2b-7070-4ea9-a110-9789e41cd235-crio-socket\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.256351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.256343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.256519 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.256450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdprb\" (UniqueName: \"kubernetes.io/projected/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-api-access-gdprb\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.356973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f55d1b2b-7070-4ea9-a110-9789e41cd235-data-volume\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f55d1b2b-7070-4ea9-a110-9789e41cd235-crio-socket\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357316 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdprb\" (UniqueName: \"kubernetes.io/projected/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-api-access-gdprb\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357316 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357098 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqqhm\" (UniqueName: \"kubernetes.io/projected/240bb924-5548-44ac-aab5-ffc41dca2bf6-kube-api-access-cqqhm\") pod \"downloads-6bcc868b7-spq2q\" (UID: \"240bb924-5548-44ac-aab5-ffc41dca2bf6\") " pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:22.357316 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f55d1b2b-7070-4ea9-a110-9789e41cd235-crio-socket\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357316 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f55d1b2b-7070-4ea9-a110-9789e41cd235-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357538 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f55d1b2b-7070-4ea9-a110-9789e41cd235-data-volume\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.357624 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.357604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.359521 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.359499 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f55d1b2b-7070-4ea9-a110-9789e41cd235-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.371649 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.371623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqqhm\" (UniqueName: \"kubernetes.io/projected/240bb924-5548-44ac-aab5-ffc41dca2bf6-kube-api-access-cqqhm\") pod \"downloads-6bcc868b7-spq2q\" (UID: \"240bb924-5548-44ac-aab5-ffc41dca2bf6\") " pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:22.372041 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.372021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdprb\" (UniqueName: \"kubernetes.io/projected/f55d1b2b-7070-4ea9-a110-9789e41cd235-kube-api-access-gdprb\") pod \"insights-runtime-extractor-lmj7s\" (UID: \"f55d1b2b-7070-4ea9-a110-9789e41cd235\") " pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.418454 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.418426 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lmj7s" Apr 16 22:16:22.460506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.460421 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:22.553404 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.552808 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lmj7s"] Apr 16 22:16:22.557141 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:22.557114 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55d1b2b_7070_4ea9_a110_9789e41cd235.slice/crio-5aeddab4fabd0f30e9e1f9ef15c417173ab0c38a6414c0437b69e0755d56aea6 WatchSource:0}: Error finding container 5aeddab4fabd0f30e9e1f9ef15c417173ab0c38a6414c0437b69e0755d56aea6: Status 404 returned error can't find the container with id 5aeddab4fabd0f30e9e1f9ef15c417173ab0c38a6414c0437b69e0755d56aea6 Apr 16 22:16:22.598292 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:22.598267 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-spq2q"] Apr 16 22:16:22.600114 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:22.600090 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240bb924_5548_44ac_aab5_ffc41dca2bf6.slice/crio-b5ba81a9ad18b3bfbd7d7756e9fe1c22ce218bc6b001f1b5c1ec95d5e5382755 WatchSource:0}: Error finding container b5ba81a9ad18b3bfbd7d7756e9fe1c22ce218bc6b001f1b5c1ec95d5e5382755: Status 404 returned error can't find the container with id b5ba81a9ad18b3bfbd7d7756e9fe1c22ce218bc6b001f1b5c1ec95d5e5382755 Apr 16 22:16:23.154891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:23.154848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmj7s" event={"ID":"f55d1b2b-7070-4ea9-a110-9789e41cd235","Type":"ContainerStarted","Data":"0c43586da70ef7b8e888f024128536f50814e24b1aa4556dffe299ac546fcad2"} Apr 16 22:16:23.154891 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:23.154896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmj7s" event={"ID":"f55d1b2b-7070-4ea9-a110-9789e41cd235","Type":"ContainerStarted","Data":"5aeddab4fabd0f30e9e1f9ef15c417173ab0c38a6414c0437b69e0755d56aea6"} Apr 16 22:16:23.155948 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:23.155911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-spq2q" event={"ID":"240bb924-5548-44ac-aab5-ffc41dca2bf6","Type":"ContainerStarted","Data":"b5ba81a9ad18b3bfbd7d7756e9fe1c22ce218bc6b001f1b5c1ec95d5e5382755"} Apr 16 22:16:24.160586 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:24.160513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmj7s" event={"ID":"f55d1b2b-7070-4ea9-a110-9789e41cd235","Type":"ContainerStarted","Data":"7925c19f29cc2cc5da0cc5fe01bc0d0f0a8ea1f50f8e2faf47d3377ce9b617b8"} Apr 16 22:16:25.165711 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:25.165659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmj7s" event={"ID":"f55d1b2b-7070-4ea9-a110-9789e41cd235","Type":"ContainerStarted","Data":"1055259ea505783b701468dc6e2c1e0a67fe73d1b6bb41fbe64cb76b24ac8a41"} Apr 16 22:16:25.192731 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:25.192675 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lmj7s" podStartSLOduration=0.830748626 podStartE2EDuration="3.192657402s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.610718866 +0000 UTC m=+165.542043706" lastFinishedPulling="2026-04-16 22:16:24.972627631 +0000 UTC m=+167.903952482" observedRunningTime="2026-04-16 22:16:25.191901751 +0000 UTC m=+168.123226625" watchObservedRunningTime="2026-04-16 22:16:25.192657402 +0000 UTC m=+168.123982269" Apr 16 22:16:26.674947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:26.674845 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:16:26.691044 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:26.691014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:16:26.693640 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:26.693620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fbcc451-603b-46ab-be67-fab197b0645c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kr46t\" (UID: \"8fbcc451-603b-46ab-be67-fab197b0645c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:16:26.892201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:26.892163 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" Apr 16 22:16:27.023525 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:27.023491 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t"] Apr 16 22:16:27.026415 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:27.026384 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbcc451_603b_46ab_be67_fab197b0645c.slice/crio-06ad3d23674a1def722b1c0effaf802ad83b952655725a325e564e7bdc594732 WatchSource:0}: Error finding container 06ad3d23674a1def722b1c0effaf802ad83b952655725a325e564e7bdc594732: Status 404 returned error can't find the container with id 06ad3d23674a1def722b1c0effaf802ad83b952655725a325e564e7bdc594732 Apr 16 22:16:27.172872 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:27.172841 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" event={"ID":"8fbcc451-603b-46ab-be67-fab197b0645c","Type":"ContainerStarted","Data":"06ad3d23674a1def722b1c0effaf802ad83b952655725a325e564e7bdc594732"} Apr 16 22:16:29.180317 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.180263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" event={"ID":"8fbcc451-603b-46ab-be67-fab197b0645c","Type":"ContainerStarted","Data":"985cd25b3ab7db0325ed156c3509484a77faa01a2cefcbea73ee4ca7f526855c"} Apr 16 22:16:29.204458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.204403 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kr46t" podStartSLOduration=33.515867426 podStartE2EDuration="35.204387549s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:16:27.028535682 +0000 UTC m=+169.959860525" lastFinishedPulling="2026-04-16 22:16:28.717055794 +0000 UTC m=+171.648380648" observedRunningTime="2026-04-16 22:16:29.202194179 +0000 UTC m=+172.133519041" watchObservedRunningTime="2026-04-16 22:16:29.204387549 +0000 UTC m=+172.135712410" Apr 16 22:16:29.275374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.275342 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb"] Apr 16 22:16:29.278677 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.278651 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:29.281516 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.281488 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-pmmqp\"" Apr 16 22:16:29.281649 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.281529 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 22:16:29.291076 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.291050 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb"] Apr 16 22:16:29.415660 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.415613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-95ddb\" (UID: \"6b034329-a956-4a82-8545-11a6b8cfddd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:29.516503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:29.516467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-95ddb\" (UID: \"6b034329-a956-4a82-8545-11a6b8cfddd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:29.516702 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:29.516646 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 22:16:29.516763 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:29.516728 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates podName:6b034329-a956-4a82-8545-11a6b8cfddd3 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:30.016706951 +0000 UTC m=+172.948031796 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-95ddb" (UID: "6b034329-a956-4a82-8545-11a6b8cfddd3") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 22:16:30.021182 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:30.021150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-95ddb\" (UID: \"6b034329-a956-4a82-8545-11a6b8cfddd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:30.023987 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:30.023957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6b034329-a956-4a82-8545-11a6b8cfddd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-95ddb\" (UID: \"6b034329-a956-4a82-8545-11a6b8cfddd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:30.190811 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:30.190782 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:30.319629 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:30.319583 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb"] Apr 16 22:16:30.322396 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:30.322367 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b034329_a956_4a82_8545_11a6b8cfddd3.slice/crio-15ce139bc7e35d15f35f683b1eae9e9df01d5e7f6ebd55cc2f7e0963b45d86f8 WatchSource:0}: Error finding container 15ce139bc7e35d15f35f683b1eae9e9df01d5e7f6ebd55cc2f7e0963b45d86f8: Status 404 returned error can't find the container with id 15ce139bc7e35d15f35f683b1eae9e9df01d5e7f6ebd55cc2f7e0963b45d86f8 Apr 16 22:16:31.153936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:31.153908 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wcg8z" Apr 16 22:16:31.188108 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:31.188071 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" event={"ID":"6b034329-a956-4a82-8545-11a6b8cfddd3","Type":"ContainerStarted","Data":"15ce139bc7e35d15f35f683b1eae9e9df01d5e7f6ebd55cc2f7e0963b45d86f8"} Apr 16 22:16:32.192219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:32.192138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" event={"ID":"6b034329-a956-4a82-8545-11a6b8cfddd3","Type":"ContainerStarted","Data":"6f68a46cfd9e955445519005469cd5916b238b1cd8855998d10673e108760d2b"} Apr 16 22:16:32.192666 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:32.192400 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:32.197723 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:32.197664 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" Apr 16 22:16:32.209648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:32.209601 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-95ddb" podStartSLOduration=1.706874305 podStartE2EDuration="3.209584636s" podCreationTimestamp="2026-04-16 22:16:29 +0000 UTC" firstStartedPulling="2026-04-16 22:16:30.324679508 +0000 UTC m=+173.256004352" lastFinishedPulling="2026-04-16 22:16:31.827389832 +0000 UTC m=+174.758714683" observedRunningTime="2026-04-16 22:16:32.209400358 +0000 UTC m=+175.140725221" watchObservedRunningTime="2026-04-16 22:16:32.209584636 +0000 UTC m=+175.140909497" Apr 16 22:16:37.687500 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.687405 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8"] Apr 16 22:16:37.693029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.692379 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.696053 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.696034 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8"] Apr 16 22:16:37.696312 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.696297 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 22:16:37.696443 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.696297 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-92gn6\"" Apr 16 22:16:37.696585 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.696345 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:37.697306 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.697291 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:37.723042 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.723017 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lwbcr"] Apr 16 22:16:37.727048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.727025 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.729853 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.729827 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pvvh7\"" Apr 16 22:16:37.730245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.729957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:37.730245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.730179 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:37.730576 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.730510 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:37.789910 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.789869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4672\" (UniqueName: \"kubernetes.io/projected/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-kube-api-access-v4672\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.790082 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.789927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.790082 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.789965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.790082 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.790035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.891286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-textfile\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891501 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.891501 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891501 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-metrics-client-ca\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891697 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4672\" (UniqueName: \"kubernetes.io/projected/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-kube-api-access-v4672\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.891697 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.891697 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891697 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.891907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891698 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2n2s\" (UniqueName: \"kubernetes.io/projected/9e414258-8f48-4834-bd1d-b4fc85272d4f-kube-api-access-r2n2s\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-sys\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-wtmp\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.891907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.891791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-root\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.892487 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.892463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.894711 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.894659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.895076 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.895057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.904358 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.904313 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4672\" (UniqueName: \"kubernetes.io/projected/2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b-kube-api-access-v4672\") pod \"openshift-state-metrics-9d44df66c-hrwj8\" (UID: \"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:37.992226 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.992190 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.992504 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.992481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2n2s\" (UniqueName: \"kubernetes.io/projected/9e414258-8f48-4834-bd1d-b4fc85272d4f-kube-api-access-r2n2s\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993037 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.992966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-sys\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993037 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.992902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-accelerators-collector-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993037 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993027 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-sys\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-wtmp\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-root\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-textfile\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 
22:16:37.993243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993467 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-metrics-client-ca\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993664 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:37.993640 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:37.993785 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:37.993720 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls podName:9e414258-8f48-4834-bd1d-b4fc85272d4f nodeName:}" failed. No retries permitted until 2026-04-16 22:16:38.49370074 +0000 UTC m=+181.425025601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls") pod "node-exporter-lwbcr" (UID: "9e414258-8f48-4834-bd1d-b4fc85272d4f") : secret "node-exporter-tls" not found Apr 16 22:16:37.993785 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993775 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-root\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.993915 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.993899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-wtmp\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.994195 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.994175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-textfile\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.994458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.994436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e414258-8f48-4834-bd1d-b4fc85272d4f-metrics-client-ca\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:37.996322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:37.996282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:38.004898 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.004858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" Apr 16 22:16:38.005653 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.005632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2n2s\" (UniqueName: \"kubernetes.io/projected/9e414258-8f48-4834-bd1d-b4fc85272d4f-kube-api-access-r2n2s\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:38.235801 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.235759 2571 patch_prober.go:28] interesting pod/image-registry-76bb79884b-57jt7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 22:16:38.235978 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.235827 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:16:38.499236 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.499179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:38.499426 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:38.499329 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:38.499426 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:38.499423 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls podName:9e414258-8f48-4834-bd1d-b4fc85272d4f nodeName:}" failed. No retries permitted until 2026-04-16 22:16:39.499400884 +0000 UTC m=+182.430725732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls") pod "node-exporter-lwbcr" (UID: "9e414258-8f48-4834-bd1d-b4fc85272d4f") : secret "node-exporter-tls" not found Apr 16 22:16:38.582626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.582540 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:16:38.587412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.587388 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.590049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.589988 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:16:38.590176 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.590150 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:16:38.590229 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.590189 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:16:38.590291 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.590257 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7vb78\"" Apr 16 22:16:38.590343 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.590333 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:16:38.590572 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.590529 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:16:38.595436 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.595406 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:16:38.598245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.598213 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:16:38.700921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.700879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.700943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.700966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.700982 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqsh\" (UniqueName: \"kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.701016 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.701099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.701373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.701160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.775905 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.775825 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:38.780656 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.780628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.784510 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.784485 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:16:38.784745 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.784724 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:16:38.784981 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.784966 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9ph9r\"" Apr 16 22:16:38.785157 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.785137 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:16:38.785341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.785325 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:16:38.785545 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.785524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:16:38.785849 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.785832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:16:38.786045 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.786023 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:16:38.786166 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.786145 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 
22:16:38.786466 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.786449 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:16:38.802372 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.797956 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.802826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqsh\" (UniqueName: \"kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.802913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.803000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.803066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.803171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.803227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.803262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.804251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.804898 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.804663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.805405 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.805379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.805621 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.805524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.809334 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.809262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.809431 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.809378 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.822461 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.822435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqsh\" (UniqueName: \"kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh\") pod \"console-6bbb95b95-29q9s\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.902194 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.902160 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:38.904192 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904310 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904206 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904310 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904418 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcz9v\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904418 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904572 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904642 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:38.904809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:38.904723 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005826 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.005791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.005849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcz9v\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.005885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005982 ip-10-0-129-102 kubenswrapper[2571]: 
I0416 22:16:39.005908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.005928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.005982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.005954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:39.006072 2571 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:39.006136 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls podName:d2aef76a-b4c3-442a-ac34-144815b90018 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:39.506116911 +0000 UTC m=+182.437441765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018") : secret "alertmanager-main-tls" not found Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006422 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006244 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.006422 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:39.006273 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle podName:d2aef76a-b4c3-442a-ac34-144815b90018 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:39.506252519 +0000 UTC m=+182.437577371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018") : configmap references non-existent config key: ca-bundle.crt Apr 16 22:16:39.006422 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.008284 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.006848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.008284 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.008013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.009359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.009174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.009359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.009323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.009359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.009323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.009843 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.009815 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.010428 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.010407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.011377 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.011358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.011676 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.011654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.011949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.011929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.021949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.021889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcz9v\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.184241 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.184188 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8"] Apr 16 22:16:39.187503 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:39.187469 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2faeab6a_a0ee_4f0e_a6f3_b39f1a09e72b.slice/crio-943947e56c797c5402fc6eaf8617264db04c7adfb5a8a68788613acab2cebe67 WatchSource:0}: Error finding container 943947e56c797c5402fc6eaf8617264db04c7adfb5a8a68788613acab2cebe67: Status 404 returned error can't find the container with id 943947e56c797c5402fc6eaf8617264db04c7adfb5a8a68788613acab2cebe67 Apr 16 22:16:39.198595 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.198570 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:16:39.203431 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:39.203390 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3019bfd0_c68c_419b_8cf7_0fb50f0c5a37.slice/crio-598019de236b1fbec9232f96155d6708a93da0c32ab7cbd0ae2ba49628491041 WatchSource:0}: Error finding container 598019de236b1fbec9232f96155d6708a93da0c32ab7cbd0ae2ba49628491041: Status 404 returned error can't find the container with id 598019de236b1fbec9232f96155d6708a93da0c32ab7cbd0ae2ba49628491041 Apr 16 22:16:39.217190 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.217161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbb95b95-29q9s" event={"ID":"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37","Type":"ContainerStarted","Data":"598019de236b1fbec9232f96155d6708a93da0c32ab7cbd0ae2ba49628491041"} Apr 16 22:16:39.219592 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:16:39.219570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" event={"ID":"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b","Type":"ContainerStarted","Data":"943947e56c797c5402fc6eaf8617264db04c7adfb5a8a68788613acab2cebe67"} Apr 16 22:16:39.510985 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.510949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:39.511180 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.511066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.511180 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.511104 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.512149 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.512097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.513925 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.513899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e414258-8f48-4834-bd1d-b4fc85272d4f-node-exporter-tls\") pod \"node-exporter-lwbcr\" (UID: \"9e414258-8f48-4834-bd1d-b4fc85272d4f\") " pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:39.514036 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.513959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.540893 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.540812 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lwbcr" Apr 16 22:16:39.550856 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:39.550829 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e414258_8f48_4834_bd1d_b4fc85272d4f.slice/crio-a2ed2a4d43a36beb007abb99de4f8fdfe16acb90be7d018e45d3a31e4bbd102a WatchSource:0}: Error finding container a2ed2a4d43a36beb007abb99de4f8fdfe16acb90be7d018e45d3a31e4bbd102a: Status 404 returned error can't find the container with id a2ed2a4d43a36beb007abb99de4f8fdfe16acb90be7d018e45d3a31e4bbd102a Apr 16 22:16:39.697078 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.697033 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:39.906383 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:39.906322 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:39.909300 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:39.909268 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2aef76a_b4c3_442a_ac34_144815b90018.slice/crio-2d0f70a0b0bc1e963b2269f15b5bef4d2c794b7a04886f707be91d0465e709a5 WatchSource:0}: Error finding container 2d0f70a0b0bc1e963b2269f15b5bef4d2c794b7a04886f707be91d0465e709a5: Status 404 returned error can't find the container with id 2d0f70a0b0bc1e963b2269f15b5bef4d2c794b7a04886f707be91d0465e709a5 Apr 16 22:16:40.149359 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.148229 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:16:40.230967 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.229258 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-spq2q" event={"ID":"240bb924-5548-44ac-aab5-ffc41dca2bf6","Type":"ContainerStarted","Data":"ded85d4450d0382a42d95c8130f91ec95802ef465c28d262d542be3ca9c2d3b6"} Apr 16 22:16:40.230967 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.230197 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:40.233177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.233109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" event={"ID":"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b","Type":"ContainerStarted","Data":"66f9252f00e7b4f059ada9a6a58e4f3d11f1c95777fe2d906247d389eb848d57"} Apr 16 22:16:40.233177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.233144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" event={"ID":"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b","Type":"ContainerStarted","Data":"fee67eb35f3beac8dac1b29c5be060ad53af03b24a40b85a73d59ee6b0b50517"} Apr 16 22:16:40.237177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.237133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"2d0f70a0b0bc1e963b2269f15b5bef4d2c794b7a04886f707be91d0465e709a5"} Apr 16 22:16:40.239229 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.239194 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-6bcc868b7-spq2q" Apr 16 22:16:40.240439 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.240368 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwbcr" event={"ID":"9e414258-8f48-4834-bd1d-b4fc85272d4f","Type":"ContainerStarted","Data":"a2ed2a4d43a36beb007abb99de4f8fdfe16acb90be7d018e45d3a31e4bbd102a"} Apr 16 22:16:40.261760 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:40.261277 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-spq2q" podStartSLOduration=1.7316662919999999 podStartE2EDuration="18.261256442s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.601953378 +0000 UTC m=+165.533278218" lastFinishedPulling="2026-04-16 22:16:39.131543513 +0000 UTC m=+182.062868368" observedRunningTime="2026-04-16 22:16:40.258935335 +0000 UTC m=+183.190260260" watchObservedRunningTime="2026-04-16 22:16:40.261256442 +0000 UTC m=+183.192581310" Apr 16 22:16:42.020385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.020348 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fdbdc7cff-g759t"] Apr 16 22:16:42.037612 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.037579 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fdbdc7cff-g759t"] Apr 16 22:16:42.037866 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.037703 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.041021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.040837 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 22:16:42.041021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.040915 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 22:16:42.041215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.041192 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 22:16:42.041826 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.041702 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-clhh8\"" Apr 16 22:16:42.042062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.041920 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:16:42.042164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.042127 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a04vdsijq953p\"" Apr 16 22:16:42.144825 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.144779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e97edd7-009d-4e18-9786-f18ece2787a7-audit-log\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.144853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8drbk\" (UniqueName: \"kubernetes.io/projected/9e97edd7-009d-4e18-9786-f18ece2787a7-kube-api-access-8drbk\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.144920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.144952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-metrics-server-audit-profiles\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.145019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-client-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.145041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-client-certs\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.145080 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.145065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-tls\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.245720 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8drbk\" (UniqueName: \"kubernetes.io/projected/9e97edd7-009d-4e18-9786-f18ece2787a7-kube-api-access-8drbk\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.245936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 
16 22:16:42.245936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245827 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-metrics-server-audit-profiles\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.245936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-client-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.245936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-client-certs\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.246165 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.245962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-tls\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.246312 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.246261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e97edd7-009d-4e18-9786-f18ece2787a7-audit-log\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.247483 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.247456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.247678 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.247634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e97edd7-009d-4e18-9786-f18ece2787a7-metrics-server-audit-profiles\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.247792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.247770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e97edd7-009d-4e18-9786-f18ece2787a7-audit-log\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.250069 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:16:42.250041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-client-certs\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.250604 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.250579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-secret-metrics-server-tls\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.254326 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.254285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e97edd7-009d-4e18-9786-f18ece2787a7-client-ca-bundle\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.259287 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.259267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drbk\" (UniqueName: \"kubernetes.io/projected/9e97edd7-009d-4e18-9786-f18ece2787a7-kube-api-access-8drbk\") pod \"metrics-server-5fdbdc7cff-g759t\" (UID: \"9e97edd7-009d-4e18-9786-f18ece2787a7\") " pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.352835 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.352710 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:16:42.487322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.487288 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd"] Apr 16 22:16:42.523213 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.523164 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd"] Apr 16 22:16:42.523392 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.523339 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:42.526052 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.526029 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 22:16:42.526522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.526376 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-wfwrt\"" Apr 16 22:16:42.550234 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.550186 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-46xrd\" (UID: \"48b33455-80b7-4a5e-9249-bdd9508d2074\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:42.651248 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:42.651158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-46xrd\" (UID: \"48b33455-80b7-4a5e-9249-bdd9508d2074\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:42.651419 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:42.651353 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 22:16:42.651470 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:16:42.651431 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert podName:48b33455-80b7-4a5e-9249-bdd9508d2074 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:43.151409406 +0000 UTC m=+186.082734251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-46xrd" (UID: "48b33455-80b7-4a5e-9249-bdd9508d2074") : secret "monitoring-plugin-cert" not found Apr 16 22:16:43.155704 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:43.155673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-46xrd\" (UID: \"48b33455-80b7-4a5e-9249-bdd9508d2074\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:43.158399 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:43.158372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/48b33455-80b7-4a5e-9249-bdd9508d2074-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-46xrd\" (UID: \"48b33455-80b7-4a5e-9249-bdd9508d2074\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:43.437417 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:43.437388 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:43.687261 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:43.687193 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fdbdc7cff-g759t"] Apr 16 22:16:43.699021 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:43.698980 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e97edd7_009d_4e18_9786_f18ece2787a7.slice/crio-311f5ad1e77246b7dc5b75516cf19b9f41f122b4c0eb62c91d2c438e205a70be WatchSource:0}: Error finding container 311f5ad1e77246b7dc5b75516cf19b9f41f122b4c0eb62c91d2c438e205a70be: Status 404 returned error can't find the container with id 311f5ad1e77246b7dc5b75516cf19b9f41f122b4c0eb62c91d2c438e205a70be Apr 16 22:16:43.715482 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:43.715310 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd"] Apr 16 22:16:43.746772 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:16:43.746732 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b33455_80b7_4a5e_9249_bdd9508d2074.slice/crio-7fee327de14438d20cf9eab4be7f70386f67ec8545a4884955b76e7ebb512d4d WatchSource:0}: Error finding container 7fee327de14438d20cf9eab4be7f70386f67ec8545a4884955b76e7ebb512d4d: Status 404 returned error can't find the container with id 7fee327de14438d20cf9eab4be7f70386f67ec8545a4884955b76e7ebb512d4d Apr 16 22:16:44.257954 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.257915 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbb95b95-29q9s" event={"ID":"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37","Type":"ContainerStarted","Data":"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390"} Apr 16 22:16:44.261595 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.261227 2571 generic.go:358] "Generic (PLEG): container finished" podID="9e414258-8f48-4834-bd1d-b4fc85272d4f" containerID="53e7a46d22d0699c82aa7a09b4805c4fd24f8512aceea911dca0e98e1506da6a" exitCode=0 Apr 16 22:16:44.261595 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.261305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwbcr" event={"ID":"9e414258-8f48-4834-bd1d-b4fc85272d4f","Type":"ContainerDied","Data":"53e7a46d22d0699c82aa7a09b4805c4fd24f8512aceea911dca0e98e1506da6a"} Apr 16 22:16:44.263972 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.263913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" event={"ID":"48b33455-80b7-4a5e-9249-bdd9508d2074","Type":"ContainerStarted","Data":"7fee327de14438d20cf9eab4be7f70386f67ec8545a4884955b76e7ebb512d4d"} Apr 16 22:16:44.267971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.267423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" event={"ID":"2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b","Type":"ContainerStarted","Data":"b811b4b3f73a2b2df6e345d63678da2d3562da92e1b4c660a696e0be4949ad60"} Apr 16 22:16:44.271051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.270394 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a" exitCode=0 Apr 16 22:16:44.271051 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.270459 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a"} Apr 16 22:16:44.273101 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.273074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" event={"ID":"9e97edd7-009d-4e18-9786-f18ece2787a7","Type":"ContainerStarted","Data":"311f5ad1e77246b7dc5b75516cf19b9f41f122b4c0eb62c91d2c438e205a70be"} Apr 16 22:16:44.296966 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.296105 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bbb95b95-29q9s" podStartSLOduration=2.003759447 podStartE2EDuration="6.296090567s" podCreationTimestamp="2026-04-16 22:16:38 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.206051682 +0000 UTC m=+182.137376529" lastFinishedPulling="2026-04-16 22:16:43.498382795 +0000 UTC m=+186.429707649" observedRunningTime="2026-04-16 22:16:44.284186861 +0000 UTC m=+187.215511721" watchObservedRunningTime="2026-04-16 22:16:44.296090567 +0000 UTC m=+187.227415428" Apr 16 22:16:44.349339 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.349310 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"] Apr 16 22:16:44.395561 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.395499 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hrwj8" podStartSLOduration=5.196107484 podStartE2EDuration="7.395481893s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.319857863 +0000 UTC m=+182.251182710" lastFinishedPulling="2026-04-16 22:16:41.51923226 +0000 UTC m=+184.450557119" observedRunningTime="2026-04-16 22:16:44.394113687 +0000 UTC m=+187.325438549" watchObservedRunningTime="2026-04-16 22:16:44.395481893 +0000 UTC m=+187.326806759" Apr 16 22:16:44.455201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.455169 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:16:44.524341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.524309 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:16:44.561395 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.561350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:16:44.561608 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.561495 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.677815 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.677777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.677868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.677900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkdrz\" (UniqueName: \"kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.677953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.677996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678241 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.678055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.678241 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.678097 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:16:44.779522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.779752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkdrz\" (UniqueName: \"kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.781640 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.781585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.782502 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.782458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.782619 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.782539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle\") pod 
\"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.783754 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.783730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.785520 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.785481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.786805 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.786766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.798410 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.798370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkdrz\" (UniqueName: \"kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz\") pod \"console-57c9bfbccc-jp84d\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:44.876328 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:44.876268 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:16:45.284230 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:45.283496 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwbcr" event={"ID":"9e414258-8f48-4834-bd1d-b4fc85272d4f","Type":"ContainerStarted","Data":"e207f3937aaf336b5ead219ea51af354812138839a1107019b60c6a7fe2a46aa"} Apr 16 22:16:45.284230 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:45.284196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lwbcr" event={"ID":"9e414258-8f48-4834-bd1d-b4fc85272d4f","Type":"ContainerStarted","Data":"5dad4257288a1d3c548b7990fe20dce81f2e885627ab7a13b903bf506fddee40"} Apr 16 22:16:45.314156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:45.313919 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lwbcr" podStartSLOduration=6.350531607 podStartE2EDuration="8.313899692s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.552859545 +0000 UTC m=+182.484184388" lastFinishedPulling="2026-04-16 22:16:41.516227633 +0000 UTC m=+184.447552473" observedRunningTime="2026-04-16 22:16:45.313649506 +0000 UTC m=+188.244974371" watchObservedRunningTime="2026-04-16 22:16:45.313899692 +0000 UTC m=+188.245224556" Apr 16 22:16:47.294296 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.294063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" event={"ID":"48b33455-80b7-4a5e-9249-bdd9508d2074","Type":"ContainerStarted","Data":"f1b0be5febb774f0f5beae9b2159de006b1598f0eeab3e3d816b93fa1ba58a05"} Apr 16 22:16:47.294662 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.294406 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:47.296476 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.296445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" event={"ID":"9e97edd7-009d-4e18-9786-f18ece2787a7","Type":"ContainerStarted","Data":"db5f9006ba2e9a4c0a0278f589eceebb326faa4b275f844054f03ef021298d91"} Apr 16 22:16:47.300519 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.300501 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" Apr 16 22:16:47.316590 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.316268 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:16:47.361824 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:47.361773 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-46xrd" podStartSLOduration=1.971639804 podStartE2EDuration="5.361752218s" podCreationTimestamp="2026-04-16 22:16:42 +0000 UTC" firstStartedPulling="2026-04-16 22:16:43.74937824 +0000 UTC m=+186.680703080" lastFinishedPulling="2026-04-16 22:16:47.139490643 +0000 UTC m=+190.070815494" observedRunningTime="2026-04-16 22:16:47.326065968 +0000 UTC m=+190.257390823" watchObservedRunningTime="2026-04-16 22:16:47.361752218 +0000 UTC m=+190.293077132" Apr 16 22:16:48.186533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.186501 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:16:48.301631 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.301591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c9bfbccc-jp84d" event={"ID":"626f6e30-77d2-4e87-a3b9-d4208540ca5a","Type":"ContainerStarted","Data":"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073"} Apr 16 22:16:48.302141 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.302117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c9bfbccc-jp84d" event={"ID":"626f6e30-77d2-4e87-a3b9-d4208540ca5a","Type":"ContainerStarted","Data":"f8ea56ae2c243e240493b77fb689314a510baa9f88707718825fae2d10dc6e56"} Apr 16 22:16:48.305257 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.305233 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919"} Apr 16 22:16:48.305370 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.305264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2"} Apr 16 22:16:48.305370 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.305277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf"} Apr 16 22:16:48.305370 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.305285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28"} Apr 16 22:16:48.305370 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.305296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643"} Apr 16 22:16:48.338590 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.336129 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57c9bfbccc-jp84d" podStartSLOduration=4.336108639 podStartE2EDuration="4.336108639s" podCreationTimestamp="2026-04-16 22:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:48.331627846 +0000 UTC m=+191.262952711" watchObservedRunningTime="2026-04-16 22:16:48.336108639 +0000 UTC m=+191.267433502" Apr 16 22:16:48.357128 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.357061 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" podStartSLOduration=2.918038858 podStartE2EDuration="6.357049949s" podCreationTimestamp="2026-04-16 22:16:42 +0000 UTC" firstStartedPulling="2026-04-16 22:16:43.702433634 +0000 UTC m=+186.633758495" lastFinishedPulling="2026-04-16 22:16:47.141444746 +0000 UTC m=+190.072769586" observedRunningTime="2026-04-16 22:16:48.355439144 +0000 UTC 
m=+191.286764005" watchObservedRunningTime="2026-04-16 22:16:48.357049949 +0000 UTC m=+191.288374789" Apr 16 22:16:48.903060 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:48.903017 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:16:49.311779 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:49.311733 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerStarted","Data":"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa"} Apr 16 22:16:49.357971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:49.357773 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.129788653 podStartE2EDuration="11.357753199s" podCreationTimestamp="2026-04-16 22:16:38 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.911432418 +0000 UTC m=+182.842757265" lastFinishedPulling="2026-04-16 22:16:49.139396969 +0000 UTC m=+192.070721811" observedRunningTime="2026-04-16 22:16:49.35739784 +0000 UTC m=+192.288722738" watchObservedRunningTime="2026-04-16 22:16:49.357753199 +0000 UTC m=+192.289078062" Apr 16 22:16:54.877104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:16:54.877067 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:17:02.352907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:02.352875 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:17:02.352907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:02.352912 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:17:09.376715 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.376645 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerName="registry" containerID="cri-o://e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a" gracePeriod=30 Apr 16 22:17:09.628485 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.628421 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:17:09.809705 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809667 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.809901 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809715 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.809901 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809751 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.809901 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809859 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.810058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809904 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.810058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809952 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnpsm\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.810058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.809993 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.810058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.810027 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token\") pod \"b090c0b3-373b-4083-99b5-0851f1e3c94b\" (UID: \"b090c0b3-373b-4083-99b5-0851f1e3c94b\") " Apr 16 22:17:09.810259 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.810196 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.810381 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.810356 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-certificates\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.810381 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.810362 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:09.812234 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.812208 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:09.812369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.812330 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:09.812420 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.812405 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:09.812727 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.812704 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:09.812850 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.812790 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm" (OuterVolumeSpecName: "kube-api-access-gnpsm") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "kube-api-access-gnpsm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:09.818617 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.818571 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b090c0b3-373b-4083-99b5-0851f1e3c94b" (UID: "b090c0b3-373b-4083-99b5-0851f1e3c94b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911405 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-installation-pull-secrets\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911438 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-bound-sa-token\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911449 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b090c0b3-373b-4083-99b5-0851f1e3c94b-image-registry-private-configuration\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911458 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-registry-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911468 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b090c0b3-373b-4083-99b5-0851f1e3c94b-ca-trust-extracted\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911480 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b090c0b3-373b-4083-99b5-0851f1e3c94b-trusted-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:09.911494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:09.911488 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnpsm\" (UniqueName: \"kubernetes.io/projected/b090c0b3-373b-4083-99b5-0851f1e3c94b-kube-api-access-gnpsm\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:10.374941 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.374909 2571 generic.go:358] "Generic (PLEG): container finished" podID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerID="e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a" exitCode=0 Apr 16 22:17:10.375081 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.374956 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" event={"ID":"b090c0b3-373b-4083-99b5-0851f1e3c94b","Type":"ContainerDied","Data":"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a"} Apr 16 22:17:10.375081 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.374979 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-76bb79884b-57jt7" event={"ID":"b090c0b3-373b-4083-99b5-0851f1e3c94b","Type":"ContainerDied","Data":"9ee558f23ff2236190c64e52d543f6996e84f7a856f9a1bd0e2f4ab804a6ecae"} Apr 16 22:17:10.375081 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.374987 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76bb79884b-57jt7" Apr 16 22:17:10.375200 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.374993 2571 scope.go:117] "RemoveContainer" containerID="e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a" Apr 16 22:17:10.384879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.384722 2571 scope.go:117] "RemoveContainer" containerID="e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a" Apr 16 22:17:10.385157 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:17:10.385099 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a\": container with ID starting with e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a not found: ID does not exist" containerID="e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a" Apr 16 22:17:10.385201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.385146 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a"} err="failed to get container status \"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a\": rpc error: code = NotFound desc = could not find container \"e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a\": container with ID starting with e80b98d2db3bc53956c6222ffbb7c0ad2850631a54e00c2f88e21a7f4c93bc8a not found: ID does not exist" Apr 16 22:17:10.400464 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.400435 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"] Apr 16 22:17:10.408522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:10.408496 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-76bb79884b-57jt7"] Apr 16 22:17:11.313572 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.313513 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bbb95b95-29q9s" podUID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" containerName="console" containerID="cri-o://6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390" gracePeriod=15 Apr 16 22:17:11.587329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.587307 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbb95b95-29q9s_3019bfd0-c68c-419b-8cf7-0fb50f0c5a37/console/0.log" Apr 16 22:17:11.587663 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.587368 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:17:11.679243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.679212 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" path="/var/lib/kubelet/pods/b090c0b3-373b-4083-99b5-0851f1e3c94b/volumes" Apr 16 22:17:11.727507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727476 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727509 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727545 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727597 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727634 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727660 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzqsh\" (UniqueName: \"kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.727752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.727698 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca\") pod \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\" (UID: \"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37\") " Apr 16 22:17:11.728111 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728082 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:11.728228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728105 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config" (OuterVolumeSpecName: "console-config") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:11.728228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728122 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca" (OuterVolumeSpecName: "service-ca") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:11.728228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728126 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:11.728460 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728441 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.728507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728468 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.728507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728485 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.728507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.728502 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.730094 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.730060 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh" (OuterVolumeSpecName: "kube-api-access-pzqsh") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "kube-api-access-pzqsh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:11.730188 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.730170 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:11.730240 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.730187 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" (UID: "3019bfd0-c68c-419b-8cf7-0fb50f0c5a37"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:11.830023 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.829928 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.830023 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.829955 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:11.830023 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:11.829969 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzqsh\" (UniqueName: \"kubernetes.io/projected/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37-kube-api-access-pzqsh\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:12.382447 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bbb95b95-29q9s_3019bfd0-c68c-419b-8cf7-0fb50f0c5a37/console/0.log" Apr 16 22:17:12.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382453 2571 generic.go:358] "Generic (PLEG): container finished" podID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" containerID="6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390" exitCode=2 Apr 16 22:17:12.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbb95b95-29q9s" event={"ID":"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37","Type":"ContainerDied","Data":"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390"} Apr 16 22:17:12.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382517 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bbb95b95-29q9s" Apr 16 22:17:12.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bbb95b95-29q9s" event={"ID":"3019bfd0-c68c-419b-8cf7-0fb50f0c5a37","Type":"ContainerDied","Data":"598019de236b1fbec9232f96155d6708a93da0c32ab7cbd0ae2ba49628491041"} Apr 16 22:17:12.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.382543 2571 scope.go:117] "RemoveContainer" containerID="6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390" Apr 16 22:17:12.390437 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.390420 2571 scope.go:117] "RemoveContainer" containerID="6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390" Apr 16 22:17:12.390704 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:17:12.390678 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390\": container with ID starting with 6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390 not found: ID does not exist" containerID="6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390" Apr 16 22:17:12.390798 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.390707 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390"} err="failed to get container status \"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390\": rpc error: code = NotFound desc = could not find container \"6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390\": container with ID starting with 6fe3130e7cb3a1741007e5c0af79835d6b0d484467ac58878739dc0cd430b390 not found: ID does not exist" Apr 16 22:17:12.401627 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.401600 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:17:12.403718 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:12.403698 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bbb95b95-29q9s"] Apr 16 22:17:13.386991 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:13.386957 2571 generic.go:358] "Generic (PLEG): container finished" podID="c7e9979f-4063-42fa-aa5b-a0d2d60f93a5" containerID="8d049bcd1a9293f5cecdb850eeece2c08af48d454f66871e93b80f7250fb4413" exitCode=0 Apr 16 22:17:13.387397 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:13.387030 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" event={"ID":"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5","Type":"ContainerDied","Data":"8d049bcd1a9293f5cecdb850eeece2c08af48d454f66871e93b80f7250fb4413"} Apr 16 22:17:13.387457 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:13.387396 2571 scope.go:117] "RemoveContainer" containerID="8d049bcd1a9293f5cecdb850eeece2c08af48d454f66871e93b80f7250fb4413" Apr 16 22:17:13.679164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:13.679087 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" path="/var/lib/kubelet/pods/3019bfd0-c68c-419b-8cf7-0fb50f0c5a37/volumes" Apr 16 22:17:14.334305 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.334241 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-console/console-57c9bfbccc-jp84d" podUID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" containerName="console" containerID="cri-o://6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073" gracePeriod=15 Apr 16 22:17:14.392200 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.392161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lntwp" event={"ID":"c7e9979f-4063-42fa-aa5b-a0d2d60f93a5","Type":"ContainerStarted","Data":"0b8743927d7a49a07110654564a388624568c5135410a4fc2f5d56a7aa9f38d2"} Apr 16 22:17:14.599877 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.599855 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57c9bfbccc-jp84d_626f6e30-77d2-4e87-a3b9-d4208540ca5a/console/0.log" Apr 16 22:17:14.599997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.599913 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:17:14.755650 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755613 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkdrz\" (UniqueName: \"kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755668 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755727 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755761 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755781 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755796 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.755832 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.755833 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert\") pod \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\" (UID: \"626f6e30-77d2-4e87-a3b9-d4208540ca5a\") " Apr 16 22:17:14.756136 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.756013 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:14.756291 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.756264 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:14.756375 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.756284 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config" (OuterVolumeSpecName: "console-config") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:14.756375 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.756337 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:14.757997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.757974 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:14.757997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.757987 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:14.758119 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.758058 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz" (OuterVolumeSpecName: "kube-api-access-jkdrz") pod "626f6e30-77d2-4e87-a3b9-d4208540ca5a" (UID: "626f6e30-77d2-4e87-a3b9-d4208540ca5a"). InnerVolumeSpecName "kube-api-access-jkdrz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857476 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857516 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857529 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857542 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857577 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/626f6e30-77d2-4e87-a3b9-d4208540ca5a-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857589 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkdrz\" (UniqueName: \"kubernetes.io/projected/626f6e30-77d2-4e87-a3b9-d4208540ca5a-kube-api-access-jkdrz\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:14.857610 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:14.857601 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/626f6e30-77d2-4e87-a3b9-d4208540ca5a-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:15.396164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396137 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57c9bfbccc-jp84d_626f6e30-77d2-4e87-a3b9-d4208540ca5a/console/0.log" Apr 16 22:17:15.396570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396178 2571 generic.go:358] "Generic (PLEG): container finished" podID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" containerID="6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073" exitCode=2 Apr 16 22:17:15.396570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396225 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c9bfbccc-jp84d" event={"ID":"626f6e30-77d2-4e87-a3b9-d4208540ca5a","Type":"ContainerDied","Data":"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073"} Apr 16 22:17:15.396570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396246 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57c9bfbccc-jp84d" Apr 16 22:17:15.396570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c9bfbccc-jp84d" event={"ID":"626f6e30-77d2-4e87-a3b9-d4208540ca5a","Type":"ContainerDied","Data":"f8ea56ae2c243e240493b77fb689314a510baa9f88707718825fae2d10dc6e56"} Apr 16 22:17:15.396570 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.396282 2571 scope.go:117] "RemoveContainer" containerID="6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073" Apr 16 22:17:15.405102 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.405088 2571 scope.go:117] "RemoveContainer" containerID="6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073" Apr 16 22:17:15.405359 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:17:15.405341 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073\": container with ID starting with 6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073 not found: ID does not exist" containerID="6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073" Apr 16 22:17:15.405416 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.405367 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073"} err="failed to get container status \"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073\": rpc error: code = NotFound desc = could not find container \"6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073\": container with ID starting with 6d1fb41c7b46b95d4909f56d1942815f8b6ef2ff6b537477f247490bc4969073 not found: ID does not exist" Apr 16 22:17:15.415141 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.415119 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:17:15.419119 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.419090 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57c9bfbccc-jp84d"] Apr 16 22:17:15.678940 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:15.678860 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" path="/var/lib/kubelet/pods/626f6e30-77d2-4e87-a3b9-d4208540ca5a/volumes" Apr 16 22:17:22.358336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:22.358298 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:17:22.362237 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:22.362205 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fdbdc7cff-g759t" Apr 16 22:17:49.437595 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:49.437526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:17:49.439961 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:49.439927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e24f7f3c-00b2-43d5-9a49-1b7ee75125a1-metrics-certs\") pod \"network-metrics-daemon-2f4gk\" (UID: \"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1\") " pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:17:49.478202 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:49.478176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\"" Apr 16 22:17:49.486280 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:49.486257 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2f4gk" Apr 16 22:17:49.601862 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:49.601837 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2f4gk"] Apr 16 22:17:49.604352 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:17:49.604322 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24f7f3c_00b2_43d5_9a49_1b7ee75125a1.slice/crio-7bdaedb71d1acfda1d3816adea94b4ca74e870f3cf3ca77dd5dd04ef31848dca WatchSource:0}: Error finding container 7bdaedb71d1acfda1d3816adea94b4ca74e870f3cf3ca77dd5dd04ef31848dca: Status 404 returned error can't find the container with id 7bdaedb71d1acfda1d3816adea94b4ca74e870f3cf3ca77dd5dd04ef31848dca Apr 16 22:17:50.497770 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:50.497735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f4gk" event={"ID":"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1","Type":"ContainerStarted","Data":"7bdaedb71d1acfda1d3816adea94b4ca74e870f3cf3ca77dd5dd04ef31848dca"} Apr 16 22:17:51.501970 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:51.501930 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f4gk" event={"ID":"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1","Type":"ContainerStarted","Data":"e23d226716aa21075a37f47df119a8bec202375a85eeb080df7d6b9170b6bdfd"} Apr 16 22:17:51.501970 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:51.501973 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2f4gk" event={"ID":"e24f7f3c-00b2-43d5-9a49-1b7ee75125a1","Type":"ContainerStarted","Data":"58768f5a94447ad8b7b51a5f558d3bd8e3b0ac554e29bbcaafdaf4889d5f2eac"} Apr 16 22:17:51.519394 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:51.519349 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2f4gk" podStartSLOduration=253.601808899 podStartE2EDuration="4m14.519335682s" podCreationTimestamp="2026-04-16 22:13:37 +0000 UTC" firstStartedPulling="2026-04-16 22:17:49.606501753 +0000 UTC m=+252.537826594" lastFinishedPulling="2026-04-16 22:17:50.524028533 +0000 UTC m=+253.455353377" observedRunningTime="2026-04-16 22:17:51.517269841 +0000 UTC m=+254.448594702" watchObservedRunningTime="2026-04-16 22:17:51.519335682 +0000 UTC m=+254.450660543" Apr 16 22:17:58.468672 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.468585 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:58.469172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469021 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" 
containerName="alertmanager" containerID="cri-o://b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643" gracePeriod=120 Apr 16 22:17:58.469172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469060 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-metric" containerID="cri-o://f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919" gracePeriod=120 Apr 16 22:17:58.469172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469115 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-web" containerID="cri-o://7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf" gracePeriod=120 Apr 16 22:17:58.469172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469142 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy" containerID="cri-o://f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2" gracePeriod=120 Apr 16 22:17:58.469388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469121 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="prom-label-proxy" containerID="cri-o://aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa" gracePeriod=120 Apr 16 22:17:58.469388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:58.469186 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="config-reloader" containerID="cri-o://0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28" gracePeriod=120 Apr 16 22:17:59.528387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528355 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa" exitCode=0 Apr 16 22:17:59.528387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528378 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919" exitCode=0 Apr 16 22:17:59.528387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528384 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2" exitCode=0 Apr 16 22:17:59.528387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528390 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28" exitCode=0 Apr 16 22:17:59.528387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528395 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643" exitCode=0 Apr 16 22:17:59.528881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528416 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa"} Apr 16 22:17:59.528881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528437 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919"} Apr 16 22:17:59.528881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2"} Apr 16 22:17:59.528881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28"} Apr 16 22:17:59.528881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.528464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643"} Apr 16 22:17:59.719583 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.719540 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:59.818616 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818526 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.818616 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818599 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.818823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818631 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.818823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818653 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.818823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818691 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.818823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818808 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818889 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818949 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcz9v\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.818979 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.819015 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819272 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.819045 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819272 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.819041 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.819272 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.819075 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config\") pod \"d2aef76a-b4c3-442a-ac34-144815b90018\" (UID: \"d2aef76a-b4c3-442a-ac34-144815b90018\") " Apr 16 22:17:59.819434 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.819391 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.820168 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.820114 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:59.820296 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.820178 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:59.821383 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.821331 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.822178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.822146 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:59.822278 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.822216 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v" (OuterVolumeSpecName: "kube-api-access-jcz9v") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "kube-api-access-jcz9v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:59.822431 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.822395 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out" (OuterVolumeSpecName: "config-out") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:59.822539 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.822458 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume" (OuterVolumeSpecName: "config-volume") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.822804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.822767 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.823639 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.823611 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.823723 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.823664 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.825959 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.825939 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.832721 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.832694 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config" (OuterVolumeSpecName: "web-config") pod "d2aef76a-b4c3-442a-ac34-144815b90018" (UID: "d2aef76a-b4c3-442a-ac34-144815b90018"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:59.920208 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920184 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-web-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920208 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920206 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-tls-assets\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920216 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-alertmanager-main-db\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920226 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920235 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jcz9v\" (UniqueName: \"kubernetes.io/projected/d2aef76a-b4c3-442a-ac34-144815b90018-kube-api-access-jcz9v\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920245 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920254 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2aef76a-b4c3-442a-ac34-144815b90018-metrics-client-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920263 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920271 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-cluster-tls-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920280 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d2aef76a-b4c3-442a-ac34-144815b90018-config-out\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920289 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-secret-alertmanager-main-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:17:59.920329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:17:59.920298 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d2aef76a-b4c3-442a-ac34-144815b90018-config-volume\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:18:00.534330 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.534299 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2aef76a-b4c3-442a-ac34-144815b90018" containerID="7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf" exitCode=0 Apr 16 22:18:00.534804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.534342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf"} Apr 16 22:18:00.534804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.534364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d2aef76a-b4c3-442a-ac34-144815b90018","Type":"ContainerDied","Data":"2d0f70a0b0bc1e963b2269f15b5bef4d2c794b7a04886f707be91d0465e709a5"} Apr 16 22:18:00.534804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.534380 2571 scope.go:117] "RemoveContainer" containerID="aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa" Apr 16 22:18:00.534804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.534408 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.542834 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.542813 2571 scope.go:117] "RemoveContainer" containerID="f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919" Apr 16 22:18:00.553101 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.553083 2571 scope.go:117] "RemoveContainer" containerID="f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2" Apr 16 22:18:00.558215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.558195 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:18:00.560315 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.560292 2571 scope.go:117] "RemoveContainer" containerID="7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf" Apr 16 22:18:00.563836 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.563816 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:18:00.567395 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.567377 2571 scope.go:117] "RemoveContainer" containerID="0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28" Apr 16 22:18:00.573425 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.573407 2571 scope.go:117] "RemoveContainer" containerID="b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643" Apr 16 22:18:00.581694 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.581674 2571 scope.go:117] "RemoveContainer" containerID="47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a" Apr 16 22:18:00.588709 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.588504 2571 scope.go:117] "RemoveContainer" 
containerID="aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa" Apr 16 22:18:00.588834 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.588810 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa\": container with ID starting with aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa not found: ID does not exist" containerID="aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa" Apr 16 22:18:00.588885 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.588848 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa"} err="failed to get container status \"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa\": rpc error: code = NotFound desc = could not find container \"aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa\": container with ID starting with aef1361db1642f33ab716c268076bbffb7e6b36ffd61b303c843b58a984984aa not found: ID does not exist" Apr 16 22:18:00.588885 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.588880 2571 scope.go:117] "RemoveContainer" containerID="f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919" Apr 16 22:18:00.588964 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.588823 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:18:00.589102 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.589081 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919\": container with ID starting with f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919 not found: ID does not exist" containerID="f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919" Apr 16 22:18:00.589148 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589111 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919"} err="failed to get container status \"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919\": rpc error: code = NotFound desc = could not find container \"f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919\": container with ID starting with f84c951c16e0726162554d0c4a5502f59a52591a69d2050088884f08023f8919 not found: ID does not exist" Apr 16 22:18:00.589148 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589131 2571 scope.go:117] "RemoveContainer" containerID="f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2" Apr 16 22:18:00.589229 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589218 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-web" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589230 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-web" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerName="registry" Apr 16 
22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589248 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerName="registry" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589258 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589272 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" containerName="console" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589277 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" containerName="console" Apr 16 22:18:00.589279 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589283 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-metric" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589288 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-metric" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589294 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="config-reloader" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589301 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="config-reloader" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589318 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="prom-label-proxy" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589327 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="prom-label-proxy" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589348 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="init-config-reloader" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589355 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="init-config-reloader" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589363 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="alertmanager" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589368 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="alertmanager" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589375 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" 
containerName="console" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589380 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" containerName="console" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.589396 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2\": container with ID starting with f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2 not found: ID does not exist" containerID="f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589434 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="alertmanager" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589446 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-metric" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589452 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy-web" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589458 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="626f6e30-77d2-4e87-a3b9-d4208540ca5a" containerName="console" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589465 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="config-reloader" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589472 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3019bfd0-c68c-419b-8cf7-0fb50f0c5a37" containerName="console" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589479 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="prom-label-proxy" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589485 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b090c0b3-373b-4083-99b5-0851f1e3c94b" containerName="registry" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589491 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" containerName="kube-rbac-proxy" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589436 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2"} err="failed to get container status \"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2\": rpc error: code = NotFound desc = could not find container \"f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2\": container with ID starting with f46d24ea19bf168bae2c2ad782fc0c96eb145ff6aea091feb5b8fd80761578d2 not found: ID does not exist" Apr 16 22:18:00.589522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589525 2571 scope.go:117] "RemoveContainer" containerID="7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf" Apr 16 22:18:00.590336 ip-10-0-129-102 
kubenswrapper[2571]: E0416 22:18:00.589837 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf\": container with ID starting with 7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf not found: ID does not exist" containerID="7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf" Apr 16 22:18:00.590336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589853 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf"} err="failed to get container status \"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf\": rpc error: code = NotFound desc = could not find container \"7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf\": container with ID starting with 7771a829ae1bfe9668e13c60f5ccdddfa28d22321e83b5c7315caa280e31a7bf not found: ID does not exist" Apr 16 22:18:00.590336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.589877 2571 scope.go:117] "RemoveContainer" containerID="0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28" Apr 16 22:18:00.590336 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.590096 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28\": container with ID starting with 0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28 not found: ID does not exist" containerID="0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28" Apr 16 22:18:00.590336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.590138 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28"} err="failed to get container status \"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28\": rpc error: code = NotFound desc = could not find container \"0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28\": container with ID starting with 0a20ddadeb2036f0519d91e06537b8c3d9584d825a1d78798be7154f738acb28 not found: ID does not exist" Apr 16 22:18:00.590336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.590160 2571 scope.go:117] "RemoveContainer" containerID="b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643" Apr 16 22:18:00.590692 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.590414 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643\": container with ID starting with b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643 not found: ID does not exist" containerID="b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643" Apr 16 22:18:00.590692 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.590439 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643"} err="failed to get container status \"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643\": rpc error: code = NotFound desc = could not find container \"b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643\": container with ID starting with 
b90309d4b4f72edfeeee58cc5d09c2031e4954cd1690e621643a1142ac965643 not found: ID does not exist" Apr 16 22:18:00.590692 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.590452 2571 scope.go:117] "RemoveContainer" containerID="47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a" Apr 16 22:18:00.590791 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:18:00.590713 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a\": container with ID starting with 47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a not found: ID does not exist" containerID="47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a" Apr 16 22:18:00.590791 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.590739 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a"} err="failed to get container status \"47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a\": rpc error: code = NotFound desc = could not find container \"47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a\": container with ID starting with 47f611f8a4ed6e0520d4c14e95e9f41b03bdeb4ec9805f8d505c1adffc5c223a not found: ID does not exist" Apr 16 22:18:00.594396 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.594383 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.597001 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.596976 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 22:18:00.597001 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.596993 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9ph9r\"" Apr 16 22:18:00.597143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.596978 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:18:00.597143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.596979 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:18:00.597344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.597328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:18:00.598146 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.597852 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:18:00.598146 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.597860 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:18:00.598445 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.598413 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:18:00.603530 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.598573 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:18:00.606115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.606094 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:18:00.607037 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.607019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:18:00.725335 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725335 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-config-out\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46dt5\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-kube-api-access-46dt5\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725462 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-web-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725695 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:18:00.725592 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.725792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.725751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.826856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.826856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.826856 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:18:00.826849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827090 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827090 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826918 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827090 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827090 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.826990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827090 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.827014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.827885 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.827860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.828086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.828065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-config-out\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.828223 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.828202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.828341 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.828326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46dt5\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-kube-api-access-46dt5\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.828456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.828441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-web-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.828580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.828565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.829373 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.829352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03b733a7-476c-4e64-86e3-41c50662d4d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.829937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830245 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:18:00.830198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830476 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830476 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.830622 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.830602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.831475 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.831453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03b733a7-476c-4e64-86e3-41c50662d4d1-config-out\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.832224 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.832204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03b733a7-476c-4e64-86e3-41c50662d4d1-web-config\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.838225 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.838204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46dt5\" (UniqueName: \"kubernetes.io/projected/03b733a7-476c-4e64-86e3-41c50662d4d1-kube-api-access-46dt5\") pod \"alertmanager-main-0\" (UID: \"03b733a7-476c-4e64-86e3-41c50662d4d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:00.909007 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:00.908971 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:18:01.031288 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:01.031263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:18:01.033959 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:18:01.033929 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b733a7_476c_4e64_86e3_41c50662d4d1.slice/crio-36428ba00f99c6d217870c0885d8414bdd1d18c641a0b88dc50a528175d1c2e8 WatchSource:0}: Error finding container 36428ba00f99c6d217870c0885d8414bdd1d18c641a0b88dc50a528175d1c2e8: Status 404 returned error can't find the container with id 36428ba00f99c6d217870c0885d8414bdd1d18c641a0b88dc50a528175d1c2e8 Apr 16 22:18:01.539051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:01.539014 2571 generic.go:358] "Generic (PLEG): container finished" podID="03b733a7-476c-4e64-86e3-41c50662d4d1" containerID="25ac338a9a1eafc21d390a44c402a52b64f762ac8781a24b0d09b9be36ebf89a" exitCode=0 Apr 16 22:18:01.539418 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:01.539072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerDied","Data":"25ac338a9a1eafc21d390a44c402a52b64f762ac8781a24b0d09b9be36ebf89a"} Apr 16 22:18:01.539418 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:01.539094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"36428ba00f99c6d217870c0885d8414bdd1d18c641a0b88dc50a528175d1c2e8"} Apr 16 22:18:01.679148 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:01.679117 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2aef76a-b4c3-442a-ac34-144815b90018" path="/var/lib/kubelet/pods/d2aef76a-b4c3-442a-ac34-144815b90018/volumes" Apr 16 22:18:02.545385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"dc14ce417344758ade08d6488d5c904a4270f187f64c16963d3c8bbc134d9719"} Apr 16 22:18:02.545385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"e10b6449ebc85862f75ce7daa3cec05eaaa89901865bc9e04672913b537bbbbf"} Apr 16 22:18:02.545806 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545396 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"2017d4e4f0c7d947d6d7d4fe83d63317b360e442f65c63f59ec4d3c08e4ee9df"} Apr 16 22:18:02.545806 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"14740958d5f0b23aa30039b74daa5b633c00671cb147e381ce1fefe0d3cbda37"} Apr 16 22:18:02.545806 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545413 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"eb8308e90bda45ada586a0cabb0df245e31389f956dcb5c0e563d3411aee31c0"} Apr 16 22:18:02.545806 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.545424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03b733a7-476c-4e64-86e3-41c50662d4d1","Type":"ContainerStarted","Data":"fa92b96f0c162556fef2d0d198fe30cd50e4ac94592509dd2d03c31d8fa9d1a1"} Apr 16 22:18:02.573870 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:02.573811 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.573791574 podStartE2EDuration="2.573791574s" podCreationTimestamp="2026-04-16 22:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:02.571850684 +0000 UTC m=+265.503175546" watchObservedRunningTime="2026-04-16 22:18:02.573791574 +0000 UTC m=+265.505116437" Apr 16 22:18:13.052068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.052027 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:18:13.055905 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.055879 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.058431 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058404 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:18:13.058696 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058672 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:18:13.058696 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058689 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7vb78\"" Apr 16 22:18:13.058855 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058693 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:18:13.058855 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058693 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:18:13.058855 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.058784 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:18:13.064239 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.064218 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:18:13.067138 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.067115 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:18:13.225287 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " 
pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225287 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225291 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trkl9\" (UniqueName: \"kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225499 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225499 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225499 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225499 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.225645 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.225510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326670 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326670 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326869 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326688 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326869 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326869 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trkl9\" (UniqueName: \"kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326869 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.326869 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.326794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.327429 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.327401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.327542 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.327429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.327542 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.327522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.327809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.327787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " 
pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.329189 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.329172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.329490 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.329469 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.335108 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.335088 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trkl9\" (UniqueName: \"kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9\") pod \"console-6cd9cfb597-xf4kk\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.368972 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.368949 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:13.489338 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.489314 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:18:13.491850 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:18:13.491814 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3dbfbd_d2e3_45f6_8cc6_0ccc8df37b44.slice/crio-47ddcc63845fefd2e546f2acda31f06c2701626dc3d32da6b59ccaef21b40bd9 WatchSource:0}: Error finding container 47ddcc63845fefd2e546f2acda31f06c2701626dc3d32da6b59ccaef21b40bd9: Status 404 returned error can't find the container with id 47ddcc63845fefd2e546f2acda31f06c2701626dc3d32da6b59ccaef21b40bd9 Apr 16 22:18:13.581115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.581042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cd9cfb597-xf4kk" event={"ID":"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44","Type":"ContainerStarted","Data":"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab"} Apr 16 22:18:13.581115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.581075 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cd9cfb597-xf4kk" event={"ID":"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44","Type":"ContainerStarted","Data":"47ddcc63845fefd2e546f2acda31f06c2701626dc3d32da6b59ccaef21b40bd9"} Apr 16 22:18:13.621896 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:13.621826 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cd9cfb597-xf4kk" podStartSLOduration=0.621811428 podStartE2EDuration="621.811428ms" podCreationTimestamp="2026-04-16 22:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:13.620710804 +0000 UTC m=+276.552035677" watchObservedRunningTime="2026-04-16 22:18:13.621811428 +0000 UTC m=+276.553136290" Apr 16 22:18:23.369451 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:18:23.369413 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:23.369451 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:23.369457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:23.374008 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:23.373981 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:23.617191 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:23.617160 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:18:37.545059 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:37.545028 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:18:37.545649 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:37.545344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:18:37.554202 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:37.554182 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:18:53.587829 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.587790 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht"] Apr 16 22:18:53.592972 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.592941 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.595752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.595729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:18:53.596501 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.596465 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:18:53.596641 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.596610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8gr2s\"" Apr 16 22:18:53.597096 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.597076 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht"] Apr 16 22:18:53.635143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.635117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.635260 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.635156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h42\" (UniqueName: \"kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.635307 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.635255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.736511 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.736479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.736676 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.736530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.736676 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.736581 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b2h42\" (UniqueName: \"kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.736861 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.736843 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.736948 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.736929 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.746140 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.746117 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h42\" (UniqueName: \"kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:53.902497 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:53.902436 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:18:54.017183 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:54.017152 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht"] Apr 16 22:18:54.020086 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:18:54.020059 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c29f37a_ab1e_4a57_bc7e_cd154b4366dd.slice/crio-6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48 WatchSource:0}: Error finding container 6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48: Status 404 returned error can't find the container with id 6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48 Apr 16 22:18:54.021707 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:54.021688 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:18:54.707769 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:54.707721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" event={"ID":"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd","Type":"ContainerStarted","Data":"6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48"} Apr 16 22:18:59.723511 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:59.723420 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerID="c77093a09acc8f15044fa1491aa5d563062907df7aa6ac244f8917b56d4a84aa" exitCode=0 Apr 16 22:18:59.723961 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:18:59.723509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" event={"ID":"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd","Type":"ContainerDied","Data":"c77093a09acc8f15044fa1491aa5d563062907df7aa6ac244f8917b56d4a84aa"} Apr 16 22:19:02.732928 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:02.732887 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerID="9c5e05f054f06cfe5c633ae91fca767bd2f2bfc73103caa500e7048d9eec669b" exitCode=0 Apr 16 22:19:02.733398 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:02.732975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" event={"ID":"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd","Type":"ContainerDied","Data":"9c5e05f054f06cfe5c633ae91fca767bd2f2bfc73103caa500e7048d9eec669b"} Apr 16 22:19:10.758696 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:10.758655 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerID="b0d302bcf97f3aa7f818a84d967451b2e68d1b8a57a1171893f0c66cbb2b05c4" exitCode=0 Apr 16 22:19:10.759063 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:10.758708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" event={"ID":"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd","Type":"ContainerDied","Data":"b0d302bcf97f3aa7f818a84d967451b2e68d1b8a57a1171893f0c66cbb2b05c4"} Apr 16 22:19:11.879031 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.879006 2571 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:19:11.992915 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.992876 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle\") pod \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " Apr 16 22:19:11.993044 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.992968 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2h42\" (UniqueName: \"kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42\") pod \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " Apr 16 22:19:11.993044 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.992987 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util\") pod \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\" (UID: \"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd\") " Apr 16 22:19:11.993493 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.993465 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle" (OuterVolumeSpecName: "bundle") pod "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" (UID: "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:11.995084 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.995063 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42" (OuterVolumeSpecName: "kube-api-access-b2h42") pod "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" (UID: "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd"). InnerVolumeSpecName "kube-api-access-b2h42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:19:11.997252 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:11.997231 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util" (OuterVolumeSpecName: "util") pod "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" (UID: "0c29f37a-ab1e-4a57-bc7e-cd154b4366dd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:12.093702 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.093634 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2h42\" (UniqueName: \"kubernetes.io/projected/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-kube-api-access-b2h42\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:19:12.093702 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.093667 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-util\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:19:12.093702 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.093678 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c29f37a-ab1e-4a57-bc7e-cd154b4366dd-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:19:12.766048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.766016 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" Apr 16 22:19:12.766048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.766023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cz62ht" event={"ID":"0c29f37a-ab1e-4a57-bc7e-cd154b4366dd","Type":"ContainerDied","Data":"6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48"} Apr 16 22:19:12.766295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:12.766057 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b30b0ce9579421773daf84a9facd5828262ee600cb2071e567fb379959bff48" Apr 16 22:19:15.480272 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480241 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv"] Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480535 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="util" Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480561 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="util" Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480578 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="extract" Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480586 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="extract" Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480600 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="pull" Apr 16 22:19:15.480652 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480605 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="pull" Apr 16 22:19:15.480856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.480663 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c29f37a-ab1e-4a57-bc7e-cd154b4366dd" containerName="extract" Apr 
16 22:19:15.532521 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.532488 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv"] Apr 16 22:19:15.532685 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.532597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.535188 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.535165 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 22:19:15.535282 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.535192 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 22:19:15.535282 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.535192 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 22:19:15.535282 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.535250 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-dn9cd\"" Apr 16 22:19:15.621151 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.621123 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/189600de-cfd0-4aad-845b-b714cadce85e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.621292 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.621182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvrn\" (UniqueName: \"kubernetes.io/projected/189600de-cfd0-4aad-845b-b714cadce85e-kube-api-access-nbvrn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.721865 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.721818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/189600de-cfd0-4aad-845b-b714cadce85e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.722048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.721920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvrn\" (UniqueName: \"kubernetes.io/projected/189600de-cfd0-4aad-845b-b714cadce85e-kube-api-access-nbvrn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.728886 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.728850 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/189600de-cfd0-4aad-845b-b714cadce85e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.730822 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.730764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbvrn\" (UniqueName: \"kubernetes.io/projected/189600de-cfd0-4aad-845b-b714cadce85e-kube-api-access-nbvrn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv\" (UID: \"189600de-cfd0-4aad-845b-b714cadce85e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.842809 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.842779 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:15.966352 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:15.966327 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv"] Apr 16 22:19:15.970600 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:19:15.970534 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189600de_cfd0_4aad_845b_b714cadce85e.slice/crio-bb74b2ff129bad6c334ecde8c94ca5ee4424e5a81e7c1f080653cee03d70a759 WatchSource:0}: Error finding container bb74b2ff129bad6c334ecde8c94ca5ee4424e5a81e7c1f080653cee03d70a759: Status 404 returned error can't find the container with id bb74b2ff129bad6c334ecde8c94ca5ee4424e5a81e7c1f080653cee03d70a759 Apr 16 22:19:16.778655 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:16.778624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" event={"ID":"189600de-cfd0-4aad-845b-b714cadce85e","Type":"ContainerStarted","Data":"bb74b2ff129bad6c334ecde8c94ca5ee4424e5a81e7c1f080653cee03d70a759"} Apr 16 22:19:20.514370 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.514340 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cw8zw"] Apr 16 22:19:20.517506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.517487 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.520322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.520305 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8p7wf\"" Apr 16 22:19:20.520600 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.520584 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 22:19:20.520911 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.520892 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 22:19:20.537262 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.537240 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cw8zw"] Apr 16 22:19:20.663366 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.663332 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.663366 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.663364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvntt\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-kube-api-access-wvntt\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.663601 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.663423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-cabundle0\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.764068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.764035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.764068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.764072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvntt\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-kube-api-access-wvntt\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.764127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-cabundle0\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:20.764191 2571 projected.go:264] Couldn't get 
secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:20.764211 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:20.764221 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:20.764238 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cw8zw: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 22:19:20.764306 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:20.764308 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates podName:c2269c5d-2a41-4f6e-a497-dacdd9ce17b1 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:21.264285889 +0000 UTC m=+344.195610752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates") pod "keda-operator-ffbb595cb-cw8zw" (UID: "c2269c5d-2a41-4f6e-a497-dacdd9ce17b1") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 22:19:20.764949 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.764902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-cabundle0\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.781170 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.781144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvntt\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-kube-api-access-wvntt\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:20.795350 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.795319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" event={"ID":"189600de-cfd0-4aad-845b-b714cadce85e","Type":"ContainerStarted","Data":"1594a4299c484a0b81c4815d2181298b08749452d17b2fe2fd3d93d74cd94d9d"} Apr 16 22:19:20.795490 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.795474 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:20.827528 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.827484 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" podStartSLOduration=1.856463884 podStartE2EDuration="5.827473501s" podCreationTimestamp="2026-04-16 22:19:15 +0000 UTC" firstStartedPulling="2026-04-16 22:19:15.971910646 +0000 UTC m=+338.903235490" lastFinishedPulling="2026-04-16 22:19:19.942920264 +0000 UTC m=+342.874245107" observedRunningTime="2026-04-16 22:19:20.825426786 +0000 UTC m=+343.756751648" watchObservedRunningTime="2026-04-16 22:19:20.827473501 +0000 UTC m=+343.758798362"
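
The mount failures above are the projected "certificates" volume waiting on material that does not exist yet: the secret keda-operator-certs is missing entirely, and kedaorg-certs exists but lacks the ca.crt key the volume projects. Both conditions clear a few seconds later, once the certificates are published (see the MountVolume.SetUp succeeded entry at 22:19:24 below). A minimal client-go sketch for checking both conditions by hand, assuming in-cluster credentials; the namespace, secret names, and key are taken from the errors above:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig() // assumes this runs on the cluster
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Secret names and the expected key come from the kubelet errors above.
        for _, name := range []string{"keda-operator-certs", "kedaorg-certs"} {
            sec, err := client.CoreV1().Secrets("openshift-keda").Get(context.TODO(), name, metav1.GetOptions{})
            if err != nil {
                fmt.Printf("%s: %v\n", name, err) // e.g. secrets "keda-operator-certs" not found
                continue
            }
            if _, ok := sec.Data["ca.crt"]; !ok {
                fmt.Printf("%s: present but missing key ca.crt\n", name)
                continue
            }
            fmt.Printf("%s: ok\n", name)
        }
    }

Apr 16 22:19:20.884094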
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.884065 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh"] Apr 16 22:19:20.887558 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.887532 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:20.890293 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.890271 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 22:19:20.897772 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:20.897750 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh"] Apr 16 22:19:21.067799 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.067711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br745\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-kube-api-access-br745\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.067799 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.067747 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ae15850c-4a54-4c15-af80-bec8758407c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.067982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.067829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.168393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.168362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ae15850c-4a54-4c15-af80-bec8758407c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.168623 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.168545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.168711 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.168640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-br745\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-kube-api-access-br745\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.168711 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.168651 2571 
secret.go:281] references non-existent secret key: tls.crt Apr 16 22:19:21.168711 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.168675 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:19:21.168711 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.168699 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh: references non-existent secret key: tls.crt Apr 16 22:19:21.168890 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.168761 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates podName:ae15850c-4a54-4c15-af80-bec8758407c6 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:21.668741239 +0000 UTC m=+344.600066079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkjh" (UID: "ae15850c-4a54-4c15-af80-bec8758407c6") : references non-existent secret key: tls.crt Apr 16 22:19:21.168890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.168760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/ae15850c-4a54-4c15-af80-bec8758407c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.179914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.179891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br745\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-kube-api-access-br745\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.196677 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.196649 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-7h7j8"] Apr 16 22:19:21.200166 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.200151 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.202580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.202543 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 22:19:21.208714 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.208694 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7h7j8"] Apr 16 22:19:21.269770 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.269745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:21.269913 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.269895 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:19:21.269956 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.269917 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:19:21.269956 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.269927 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cw8zw: references non-existent secret key: ca.crt Apr 16 22:19:21.270027 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.269975 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates podName:c2269c5d-2a41-4f6e-a497-dacdd9ce17b1 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:22.269959088 +0000 UTC m=+345.201283929 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates") pod "keda-operator-ffbb595cb-cw8zw" (UID: "c2269c5d-2a41-4f6e-a497-dacdd9ce17b1") : references non-existent secret key: ca.crt Apr 16 22:19:21.370903 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.370826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-certificates\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.370903 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.370856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x65vv\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-kube-api-access-x65vv\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.471814 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.471779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-certificates\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.471967 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.471820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x65vv\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-kube-api-access-x65vv\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.474390 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.474365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-certificates\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.483560 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.483520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x65vv\" (UniqueName: \"kubernetes.io/projected/17a09fff-57bf-47d3-a41f-2c2a50df4929-kube-api-access-x65vv\") pod \"keda-admission-cf49989db-7h7j8\" (UID: \"17a09fff-57bf-47d3-a41f-2c2a50df4929\") " pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.511500 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.511475 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:21.629256 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.629216 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7h7j8"] Apr 16 22:19:21.673705 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.673678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:21.673839 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.673815 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:19:21.673839 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.673832 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:19:21.673949 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.673855 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh: references non-existent secret key: tls.crt Apr 16 22:19:21.673949 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:21.673903 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates podName:ae15850c-4a54-4c15-af80-bec8758407c6 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:22.673887824 +0000 UTC m=+345.605212664 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkjh" (UID: "ae15850c-4a54-4c15-af80-bec8758407c6") : references non-existent secret key: tls.crt Apr 16 22:19:21.799735 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:21.799699 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7h7j8" event={"ID":"17a09fff-57bf-47d3-a41f-2c2a50df4929","Type":"ContainerStarted","Data":"e875386107ae271115c011b937f3c18334c562c2b64c92b0d2680adfb5bd911c"} Apr 16 22:19:22.279335 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:22.279301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:22.279538 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.279436 2571 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:19:22.279538 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.279453 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:19:22.279538 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.279465 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-cw8zw: references non-existent secret key: ca.crt Apr 16 22:19:22.279538 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.279521 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates podName:c2269c5d-2a41-4f6e-a497-dacdd9ce17b1 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:24.279504412 +0000 UTC m=+347.210829255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates") pod "keda-operator-ffbb595cb-cw8zw" (UID: "c2269c5d-2a41-4f6e-a497-dacdd9ce17b1") : references non-existent secret key: ca.crt Apr 16 22:19:22.683667 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:22.683578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:22.684093 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.683728 2571 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:19:22.684093 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.683745 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:19:22.684093 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.683766 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh: references non-existent secret key: tls.crt Apr 16 22:19:22.684093 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:19:22.683819 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates podName:ae15850c-4a54-4c15-af80-bec8758407c6 nodeName:}" failed. No retries permitted until 2026-04-16 22:19:24.683805014 +0000 UTC m=+347.615129854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates") pod "keda-metrics-apiserver-7c9f485588-5zkjh" (UID: "ae15850c-4a54-4c15-af80-bec8758407c6") : references non-existent secret key: tls.crt Apr 16 22:19:23.809110 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:23.809076 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7h7j8" event={"ID":"17a09fff-57bf-47d3-a41f-2c2a50df4929","Type":"ContainerStarted","Data":"4b7385098ad375d9a8c6b180e110974db602bb90b2e54e7b2e4428880429d5b3"} Apr 16 22:19:23.809584 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:23.809138 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:23.826898 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:23.826857 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-7h7j8" podStartSLOduration=1.490546419 podStartE2EDuration="2.826842237s" podCreationTimestamp="2026-04-16 22:19:21 +0000 UTC" firstStartedPulling="2026-04-16 22:19:21.63542662 +0000 UTC m=+344.566751459" lastFinishedPulling="2026-04-16 22:19:22.971722399 +0000 UTC m=+345.903047277" observedRunningTime="2026-04-16 22:19:23.825292167 +0000 UTC m=+346.756617029" watchObservedRunningTime="2026-04-16 22:19:23.826842237 +0000 UTC m=+346.758167099" Apr 16 22:19:24.297039 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.297006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:24.299369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.299342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c2269c5d-2a41-4f6e-a497-dacdd9ce17b1-certificates\") pod \"keda-operator-ffbb595cb-cw8zw\" (UID: \"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1\") " pod="openshift-keda/keda-operator-ffbb595cb-cw8zw"
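
Note the backoff progression across the failed SetUp attempts for these two volumes: durationBeforeRetry goes 500ms, then 1s, then 2s, the per-volume exponential backoff that nestedpendingoperations.go applies until the operation finally succeeds at 22:19:24. A short sketch of that doubling pattern; the 500ms starting value matches the log, while the cap here is an assumed placeholder rather than kubelet's exact constant:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // 500ms initial delay, as in the first failure above; doubles per retry.
        delay := 500 * time.Millisecond
        maxDelay := 2 * time.Minute // assumed cap for illustration
        for attempt := 1; attempt <= 5; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

Apr 16 22:19:24.428104 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.428076 2571 util.go:30] "No sandbox for pod can be found.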
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:24.542492 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.542467 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-cw8zw"] Apr 16 22:19:24.544070 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:19:24.544041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2269c5d_2a41_4f6e_a497_dacdd9ce17b1.slice/crio-b17f4a57990850594df7862fd83ae3950d6363c9191fd93812add6f5fc76e5b8 WatchSource:0}: Error finding container b17f4a57990850594df7862fd83ae3950d6363c9191fd93812add6f5fc76e5b8: Status 404 returned error can't find the container with id b17f4a57990850594df7862fd83ae3950d6363c9191fd93812add6f5fc76e5b8 Apr 16 22:19:24.700876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.700785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:24.703376 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.703355 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ae15850c-4a54-4c15-af80-bec8758407c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5zkjh\" (UID: \"ae15850c-4a54-4c15-af80-bec8758407c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:24.800066 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.800028 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:24.813533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.813508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" event={"ID":"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1","Type":"ContainerStarted","Data":"b17f4a57990850594df7862fd83ae3950d6363c9191fd93812add6f5fc76e5b8"} Apr 16 22:19:24.914245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:24.914219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh"] Apr 16 22:19:24.916191 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:19:24.916165 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae15850c_4a54_4c15_af80_bec8758407c6.slice/crio-d10d0a6c69a046df72668f63fe79c0f99c592d4d5e762389faf5dad7bc551f99 WatchSource:0}: Error finding container d10d0a6c69a046df72668f63fe79c0f99c592d4d5e762389faf5dad7bc551f99: Status 404 returned error can't find the container with id d10d0a6c69a046df72668f63fe79c0f99c592d4d5e762389faf5dad7bc551f99 Apr 16 22:19:25.818325 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:25.818287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" event={"ID":"ae15850c-4a54-4c15-af80-bec8758407c6","Type":"ContainerStarted","Data":"d10d0a6c69a046df72668f63fe79c0f99c592d4d5e762389faf5dad7bc551f99"} Apr 16 22:19:28.830372 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.830334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" event={"ID":"c2269c5d-2a41-4f6e-a497-dacdd9ce17b1","Type":"ContainerStarted","Data":"61bc8c5b18f5419b28069c318f1c62ede40edc560c377ce9027bfff4f64b598b"} Apr 16 22:19:28.830859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.830419 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:19:28.831747 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.831727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" event={"ID":"ae15850c-4a54-4c15-af80-bec8758407c6","Type":"ContainerStarted","Data":"5634b251649662947596a1c70c20f21e23e49b53a3ae43cf518038bd79f413d3"} Apr 16 22:19:28.831877 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.831849 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:28.847634 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.847580 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" podStartSLOduration=4.969275705 podStartE2EDuration="8.847569576s" podCreationTimestamp="2026-04-16 22:19:20 +0000 UTC" firstStartedPulling="2026-04-16 22:19:24.54545082 +0000 UTC m=+347.476775662" lastFinishedPulling="2026-04-16 22:19:28.423744693 +0000 UTC m=+351.355069533" observedRunningTime="2026-04-16 22:19:28.84689337 +0000 UTC m=+351.778218231" watchObservedRunningTime="2026-04-16 22:19:28.847569576 +0000 UTC m=+351.778894438" Apr 16 22:19:28.864543 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:28.864505 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" podStartSLOduration=5.358394576 
podStartE2EDuration="8.864494879s" podCreationTimestamp="2026-04-16 22:19:20 +0000 UTC" firstStartedPulling="2026-04-16 22:19:24.917468837 +0000 UTC m=+347.848793677" lastFinishedPulling="2026-04-16 22:19:28.4235691 +0000 UTC m=+351.354893980" observedRunningTime="2026-04-16 22:19:28.863221444 +0000 UTC m=+351.794546350" watchObservedRunningTime="2026-04-16 22:19:28.864494879 +0000 UTC m=+351.795819740" Apr 16 22:19:39.840366 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:39.840330 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5zkjh" Apr 16 22:19:41.802094 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:41.802067 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jdvtv" Apr 16 22:19:44.816023 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:44.815993 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-7h7j8" Apr 16 22:19:49.838147 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:19:49.838106 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-cw8zw" Apr 16 22:20:29.087706 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.087671 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:20:29.091126 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.091107 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.093823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.093805 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 22:20:29.093823 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.093814 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:20:29.094737 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.094719 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:20:29.094846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.094721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wp2hg\"" Apr 16 22:20:29.099343 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.099321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:20:29.134686 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.134662 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-h4xp9"] Apr 16 22:20:29.137168 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.137140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfgt\" (UniqueName: \"kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.137298 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.137264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.137774 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.137759 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.140134 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.140113 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fkqp7\"" Apr 16 22:20:29.140242 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.140114 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:20:29.147873 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.147854 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-h4xp9"] Apr 16 22:20:29.237675 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.237643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.237857 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.237680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfgt\" (UniqueName: \"kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.237857 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.237712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08407eaf-053b-482d-958c-ee5b0a4357bd-data\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.237857 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.237763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpdm\" (UniqueName: \"kubernetes.io/projected/08407eaf-053b-482d-958c-ee5b0a4357bd-kube-api-access-5bpdm\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.237857 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:20:29.237814 2571 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 22:20:29.238079 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:20:29.237879 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert podName:c9bd0c67-e39a-4b4f-9187-87687d5dc182 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:29.737859793 +0000 UTC m=+412.669184638 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert") pod "kserve-controller-manager-84d7d5cfc6-p68c7" (UID: "c9bd0c67-e39a-4b4f-9187-87687d5dc182") : secret "kserve-webhook-server-cert" not found Apr 16 22:20:29.250016 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.249993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfgt\" (UniqueName: \"kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.338646 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.338535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08407eaf-053b-482d-958c-ee5b0a4357bd-data\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.338646 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.338592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpdm\" (UniqueName: \"kubernetes.io/projected/08407eaf-053b-482d-958c-ee5b0a4357bd-kube-api-access-5bpdm\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.338877 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.338858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/08407eaf-053b-482d-958c-ee5b0a4357bd-data\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.355907 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.355878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpdm\" (UniqueName: \"kubernetes.io/projected/08407eaf-053b-482d-958c-ee5b0a4357bd-kube-api-access-5bpdm\") pod \"seaweedfs-86cc847c5c-h4xp9\" (UID: \"08407eaf-053b-482d-958c-ee5b0a4357bd\") " pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.447418 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.447386 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:29.581877 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.581846 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-h4xp9"] Apr 16 22:20:29.585201 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:20:29.585168 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08407eaf_053b_482d_958c_ee5b0a4357bd.slice/crio-9fdaa195b9edb8f95155dcca7fe3826bdad0b83ec5c58735a626f7a19df64083 WatchSource:0}: Error finding container 9fdaa195b9edb8f95155dcca7fe3826bdad0b83ec5c58735a626f7a19df64083: Status 404 returned error can't find the container with id 9fdaa195b9edb8f95155dcca7fe3826bdad0b83ec5c58735a626f7a19df64083 Apr 16 22:20:29.742204 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.742172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:29.744452 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:29.744427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") pod \"kserve-controller-manager-84d7d5cfc6-p68c7\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:30.002496 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:30.002472 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:30.025850 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:30.025820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-h4xp9" event={"ID":"08407eaf-053b-482d-958c-ee5b0a4357bd","Type":"ContainerStarted","Data":"9fdaa195b9edb8f95155dcca7fe3826bdad0b83ec5c58735a626f7a19df64083"} Apr 16 22:20:30.144814 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:30.144781 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:20:30.147717 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:20:30.147678 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9bd0c67_e39a_4b4f_9187_87687d5dc182.slice/crio-f9d12ebe84b3294ca70d95986a803ef1caf3cbd70e24cd8cd69da924d06fb70b WatchSource:0}: Error finding container f9d12ebe84b3294ca70d95986a803ef1caf3cbd70e24cd8cd69da924d06fb70b: Status 404 returned error can't find the container with id f9d12ebe84b3294ca70d95986a803ef1caf3cbd70e24cd8cd69da924d06fb70b Apr 16 22:20:31.030615 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:31.030578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" event={"ID":"c9bd0c67-e39a-4b4f-9187-87687d5dc182","Type":"ContainerStarted","Data":"f9d12ebe84b3294ca70d95986a803ef1caf3cbd70e24cd8cd69da924d06fb70b"} Apr 16 22:20:34.043332 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.043288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" 
event={"ID":"c9bd0c67-e39a-4b4f-9187-87687d5dc182","Type":"ContainerStarted","Data":"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062"} Apr 16 22:20:34.043812 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.043362 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:20:34.044633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.044611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-h4xp9" event={"ID":"08407eaf-053b-482d-958c-ee5b0a4357bd","Type":"ContainerStarted","Data":"179c9e9b1dffaad0ce5136fe39065145c86b1a15b85307b4c7395ed7932ad001"} Apr 16 22:20:34.044726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.044711 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:20:34.060458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.060410 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" podStartSLOduration=1.5050269379999999 podStartE2EDuration="5.060399469s" podCreationTimestamp="2026-04-16 22:20:29 +0000 UTC" firstStartedPulling="2026-04-16 22:20:30.149192996 +0000 UTC m=+413.080517850" lastFinishedPulling="2026-04-16 22:20:33.704565528 +0000 UTC m=+416.635890381" observedRunningTime="2026-04-16 22:20:34.058135227 +0000 UTC m=+416.989460089" watchObservedRunningTime="2026-04-16 22:20:34.060399469 +0000 UTC m=+416.991724438" Apr 16 22:20:34.072991 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:34.072951 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-h4xp9" podStartSLOduration=0.899638939 podStartE2EDuration="5.072942186s" podCreationTimestamp="2026-04-16 22:20:29 +0000 UTC" firstStartedPulling="2026-04-16 22:20:29.586804726 +0000 UTC m=+412.518129572" lastFinishedPulling="2026-04-16 22:20:33.760107976 +0000 UTC m=+416.691432819" observedRunningTime="2026-04-16 22:20:34.071277262 +0000 UTC m=+417.002602124" watchObservedRunningTime="2026-04-16 22:20:34.072942186 +0000 UTC m=+417.004267048" Apr 16 22:20:40.050484 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:20:40.050451 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-h4xp9" Apr 16 22:21:05.052512 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.052483 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:21:05.255880 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.255844 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:21:05.256085 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.256047 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" podUID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" containerName="manager" containerID="cri-o://598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062" gracePeriod=10 Apr 16 22:21:05.287010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.286987 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-snfdj"] Apr 16 22:21:05.290401 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.290386 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.311913 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.311857 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-snfdj"] Apr 16 22:21:05.444014 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.443985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-cert\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.444151 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.444021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkwz\" (UniqueName: \"kubernetes.io/projected/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-kube-api-access-npkwz\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.492992 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.492968 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:21:05.544693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.544662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-cert\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.544693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.544695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npkwz\" (UniqueName: \"kubernetes.io/projected/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-kube-api-access-npkwz\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.547062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.547029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-cert\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.552536 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.552506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkwz\" (UniqueName: \"kubernetes.io/projected/4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f-kube-api-access-npkwz\") pod \"kserve-controller-manager-84d7d5cfc6-snfdj\" (UID: \"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.645596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.645493 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:05.645596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.645587 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfgt\" (UniqueName: \"kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt\") pod \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " Apr 16 22:21:05.645767 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.645622 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") pod \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\" (UID: \"c9bd0c67-e39a-4b4f-9187-87687d5dc182\") " Apr 16 22:21:05.647737 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.647711 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt" (OuterVolumeSpecName: "kube-api-access-4vfgt") pod "c9bd0c67-e39a-4b4f-9187-87687d5dc182" (UID: "c9bd0c67-e39a-4b4f-9187-87687d5dc182"). InnerVolumeSpecName "kube-api-access-4vfgt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:21:05.647803 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.647707 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert" (OuterVolumeSpecName: "cert") pod "c9bd0c67-e39a-4b4f-9187-87687d5dc182" (UID: "c9bd0c67-e39a-4b4f-9187-87687d5dc182"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:21:05.747403 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.747368 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vfgt\" (UniqueName: \"kubernetes.io/projected/c9bd0c67-e39a-4b4f-9187-87687d5dc182-kube-api-access-4vfgt\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:21:05.747403 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.747405 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9bd0c67-e39a-4b4f-9187-87687d5dc182-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:21:05.780082 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:05.780061 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-snfdj"] Apr 16 22:21:05.782040 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:21:05.782011 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbc9c3f_0ce4_4d17_959b_3b8d2b50fb9f.slice/crio-be9a341002e95b87ad1d7cf3b213549db94a2c5dc2f90ad5f6cbbfdd11c12c29 WatchSource:0}: Error finding container be9a341002e95b87ad1d7cf3b213549db94a2c5dc2f90ad5f6cbbfdd11c12c29: Status 404 returned error can't find the container with id be9a341002e95b87ad1d7cf3b213549db94a2c5dc2f90ad5f6cbbfdd11c12c29 Apr 16 22:21:06.148828 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.148792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" event={"ID":"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f","Type":"ContainerStarted","Data":"be9a341002e95b87ad1d7cf3b213549db94a2c5dc2f90ad5f6cbbfdd11c12c29"} Apr 16 22:21:06.149884 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.149862 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" containerID="598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062" exitCode=0 Apr 16 22:21:06.149964 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.149923 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" Apr 16 22:21:06.150012 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.149924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" event={"ID":"c9bd0c67-e39a-4b4f-9187-87687d5dc182","Type":"ContainerDied","Data":"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062"} Apr 16 22:21:06.150048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.150021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-p68c7" event={"ID":"c9bd0c67-e39a-4b4f-9187-87687d5dc182","Type":"ContainerDied","Data":"f9d12ebe84b3294ca70d95986a803ef1caf3cbd70e24cd8cd69da924d06fb70b"} Apr 16 22:21:06.150048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.150039 2571 scope.go:117] "RemoveContainer" containerID="598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062" Apr 16 22:21:06.162190 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.162162 2571 scope.go:117] "RemoveContainer" containerID="598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062" Apr 16 22:21:06.162428 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:21:06.162407 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062\": container with ID starting with 598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062 not found: ID does not exist" containerID="598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062" Apr 16 22:21:06.162487 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.162435 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062"} err="failed to get container status \"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062\": rpc error: code = NotFound desc = could not find container \"598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062\": container with ID starting with 598e3703021668e293589e4e6ebfd4425c9fd6829cdc868b022c77eeefb35062 not found: ID does not exist" Apr 16 22:21:06.165226 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.165205 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:21:06.167302 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:06.167284 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-p68c7"] Apr 16 22:21:07.155574 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:07.155519 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" event={"ID":"4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f","Type":"ContainerStarted","Data":"14cabec8f979d7523439cd7d7ee593a62eefb94382e8ba88052d0bbc9a88a2e1"} Apr 16 22:21:07.156019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:07.155585 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj"
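
The RemoveContainer / ContainerStatus NotFound pair above is a benign race: the runtime had already removed the container by the time of the second lookup, so the status call fails with gRPC NotFound and the pod container deletor just logs the error and moves on. The usual way to write such cleanup is to treat NotFound as success; a sketch of that idempotent-delete pattern, with the CRI client reduced to a stub interface (an assumption for illustration, not kubelet's actual wiring):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // runtimeClient is a stand-in for the CRI runtime service.
    type runtimeClient interface {
        RemoveContainer(id string) error
    }

    // removeIdempotent treats a gRPC NotFound as "already removed" and succeeds.
    func removeIdempotent(rc runtimeClient, id string) error {
        err := rc.RemoveContainer(id)
        if status.Code(err) == codes.NotFound {
            fmt.Printf("container %s already gone; treating as removed\n", id)
            return nil
        }
        return err
    }

    // gone always reports NotFound, like the runtime in the entries above.
    type gone struct{}

    func (gone) RemoveContainer(id string) error {
        return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
        // ID prefix taken from the log above, shortened for readability.
        if err := removeIdempotent(gone{}, "598e3703021668"); err != nil {
            panic(err)
        }
    }

Apr 16 22:21:07.171600 ip-10-0-129-102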
kubenswrapper[2571]: I0416 22:21:07.171512 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" podStartSLOduration=1.782980631 podStartE2EDuration="2.171495227s" podCreationTimestamp="2026-04-16 22:21:05 +0000 UTC" firstStartedPulling="2026-04-16 22:21:05.783279165 +0000 UTC m=+448.714604006" lastFinishedPulling="2026-04-16 22:21:06.171793748 +0000 UTC m=+449.103118602" observedRunningTime="2026-04-16 22:21:07.170149997 +0000 UTC m=+450.101474859" watchObservedRunningTime="2026-04-16 22:21:07.171495227 +0000 UTC m=+450.102820089" Apr 16 22:21:07.678443 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:07.678410 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" path="/var/lib/kubelet/pods/c9bd0c67-e39a-4b4f-9187-87687d5dc182/volumes" Apr 16 22:21:38.165282 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:38.165250 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-snfdj" Apr 16 22:21:38.999926 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:38.999889 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-89t7g"] Apr 16 22:21:39.000254 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.000240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" containerName="manager" Apr 16 22:21:39.000322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.000254 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" containerName="manager" Apr 16 22:21:39.000322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.000307 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9bd0c67-e39a-4b4f-9187-87687d5dc182" containerName="manager" Apr 16 22:21:39.004585 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.004565 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.007061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.007035 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 22:21:39.007308 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.007293 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-gh68r\"" Apr 16 22:21:39.012293 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.012270 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-89t7g"] Apr 16 22:21:39.014031 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.014003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59fnd\" (UniqueName: \"kubernetes.io/projected/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-kube-api-access-59fnd\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.014129 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.014054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-tls-certs\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.016043 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.016023 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-4k48d"] Apr 16 22:21:39.019437 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.019418 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.021783 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.021759 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-7px2b\"" Apr 16 22:21:39.021878 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.021762 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 22:21:39.029753 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.029726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4k48d"] Apr 16 22:21:39.115049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.115013 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.115412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.115388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59fnd\" (UniqueName: \"kubernetes.io/projected/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-kube-api-access-59fnd\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.115508 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.115451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-tls-certs\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.115583 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.115509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fw8p\" (UniqueName: \"kubernetes.io/projected/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-kube-api-access-9fw8p\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.117961 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.117933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-tls-certs\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.123906 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.123884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59fnd\" (UniqueName: \"kubernetes.io/projected/edcb5ce6-dde1-4d26-bd47-0da4c3382dca-kube-api-access-59fnd\") pod \"model-serving-api-86f7b4b499-89t7g\" (UID: \"edcb5ce6-dde1-4d26-bd47-0da4c3382dca\") " pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.216596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.216537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fw8p\" (UniqueName: \"kubernetes.io/projected/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-kube-api-access-9fw8p\") pod \"odh-model-controller-696fc77849-4k48d\" 
(UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.217035 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.216629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.217035 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:21:39.216735 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 22:21:39.217035 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:21:39.216793 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert podName:7b3ebba6-88e3-47e5-bbea-2c83998dd7c7 nodeName:}" failed. No retries permitted until 2026-04-16 22:21:39.716776911 +0000 UTC m=+482.648101750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert") pod "odh-model-controller-696fc77849-4k48d" (UID: "7b3ebba6-88e3-47e5-bbea-2c83998dd7c7") : secret "odh-model-controller-webhook-cert" not found Apr 16 22:21:39.225784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.225762 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fw8p\" (UniqueName: \"kubernetes.io/projected/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-kube-api-access-9fw8p\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.317695 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.317604 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:39.442689 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.442666 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-89t7g"] Apr 16 22:21:39.444397 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:21:39.444373 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedcb5ce6_dde1_4d26_bd47_0da4c3382dca.slice/crio-f6c328b0c2d25718408e2eaec3c442b654eacd532fc147e4d9f7df57e51e51db WatchSource:0}: Error finding container f6c328b0c2d25718408e2eaec3c442b654eacd532fc147e4d9f7df57e51e51db: Status 404 returned error can't find the container with id f6c328b0c2d25718408e2eaec3c442b654eacd532fc147e4d9f7df57e51e51db Apr 16 22:21:39.721058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.720971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.723328 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.723302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b3ebba6-88e3-47e5-bbea-2c83998dd7c7-cert\") pod \"odh-model-controller-696fc77849-4k48d\" (UID: \"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7\") " pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:39.931977 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:39.931945 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:40.050029 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:40.049995 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4k48d"] Apr 16 22:21:40.052844 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:21:40.052817 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3ebba6_88e3_47e5_bbea_2c83998dd7c7.slice/crio-986ad051606d4dfecdcf8441fb3f860f0ee31dd58461a4eadaf33b8bec71d7fb WatchSource:0}: Error finding container 986ad051606d4dfecdcf8441fb3f860f0ee31dd58461a4eadaf33b8bec71d7fb: Status 404 returned error can't find the container with id 986ad051606d4dfecdcf8441fb3f860f0ee31dd58461a4eadaf33b8bec71d7fb Apr 16 22:21:40.266064 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:40.266012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4k48d" event={"ID":"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7","Type":"ContainerStarted","Data":"986ad051606d4dfecdcf8441fb3f860f0ee31dd58461a4eadaf33b8bec71d7fb"} Apr 16 22:21:40.267228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:40.267203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-89t7g" event={"ID":"edcb5ce6-dde1-4d26-bd47-0da4c3382dca","Type":"ContainerStarted","Data":"f6c328b0c2d25718408e2eaec3c442b654eacd532fc147e4d9f7df57e51e51db"} Apr 16 22:21:42.276952 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:42.276902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-89t7g" 
event={"ID":"edcb5ce6-dde1-4d26-bd47-0da4c3382dca","Type":"ContainerStarted","Data":"252f000240919805cf815b187f47babbff98510dd021cbbd53029053fe83d404"} Apr 16 22:21:42.277416 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:42.277031 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:42.309314 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:42.309266 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-89t7g" podStartSLOduration=2.347021184 podStartE2EDuration="4.309250847s" podCreationTimestamp="2026-04-16 22:21:38 +0000 UTC" firstStartedPulling="2026-04-16 22:21:39.446256086 +0000 UTC m=+482.377580926" lastFinishedPulling="2026-04-16 22:21:41.408485743 +0000 UTC m=+484.339810589" observedRunningTime="2026-04-16 22:21:42.308947227 +0000 UTC m=+485.240272091" watchObservedRunningTime="2026-04-16 22:21:42.309250847 +0000 UTC m=+485.240575736" Apr 16 22:21:43.282616 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:43.282583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4k48d" event={"ID":"7b3ebba6-88e3-47e5-bbea-2c83998dd7c7","Type":"ContainerStarted","Data":"262a5bcdb728a49674e4915c15a5ed674ab98663c705f354a83a8c2d9df73eed"} Apr 16 22:21:43.282973 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:43.282708 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:21:43.312662 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:43.312536 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-4k48d" podStartSLOduration=2.314954197 podStartE2EDuration="5.312517509s" podCreationTimestamp="2026-04-16 22:21:38 +0000 UTC" firstStartedPulling="2026-04-16 22:21:40.054098374 +0000 UTC m=+482.985423213" lastFinishedPulling="2026-04-16 22:21:43.051661671 +0000 UTC m=+485.982986525" observedRunningTime="2026-04-16 22:21:43.311993327 +0000 UTC m=+486.243318188" watchObservedRunningTime="2026-04-16 22:21:43.312517509 +0000 UTC m=+486.243842372" Apr 16 22:21:53.287337 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:53.287304 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-89t7g" Apr 16 22:21:54.288166 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:21:54.288129 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-4k48d" Apr 16 22:22:09.254189 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:09.254145 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:22:14.592470 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.592440 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:22:14.599596 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.599575 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.602243 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.602223 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-c8805-predictor-serving-cert\"" Apr 16 22:22:14.602360 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.602274 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:22:14.602360 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.602295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\"" Apr 16 22:22:14.602360 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.602320 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gj7r7\"" Apr 16 22:22:14.603228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.603210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:22:14.607086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.607064 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:22:14.713542 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.713507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.713730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.713578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.713730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.713655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.713730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.713714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7nh\" (UniqueName: \"kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 
22:22:14.814218 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.814178 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.814393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.814260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7nh\" (UniqueName: \"kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.814393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.814337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.814393 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.814371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.814540 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:22:14.814481 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-serving-cert: secret "isvc-raw-sklearn-batcher-c8805-predictor-serving-cert" not found Apr 16 22:22:14.814599 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:22:14.814573 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls podName:0c42a725-a720-4322-b803-0353062f18fb nodeName:}" failed. No retries permitted until 2026-04-16 22:22:15.314541785 +0000 UTC m=+518.245866624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls") pod "isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" (UID: "0c42a725-a720-4322-b803-0353062f18fb") : secret "isvc-raw-sklearn-batcher-c8805-predictor-serving-cert" not found Apr 16 22:22:14.814646 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.814628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.815068 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.815047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:14.822624 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:14.822597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7nh\" (UniqueName: \"kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:15.318784 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:15.318726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:15.318970 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:22:15.318897 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-serving-cert: secret "isvc-raw-sklearn-batcher-c8805-predictor-serving-cert" not found Apr 16 22:22:15.319012 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:22:15.318974 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls podName:0c42a725-a720-4322-b803-0353062f18fb nodeName:}" failed. No retries permitted until 2026-04-16 22:22:16.318959293 +0000 UTC m=+519.250284137 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls") pod "isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" (UID: "0c42a725-a720-4322-b803-0353062f18fb") : secret "isvc-raw-sklearn-batcher-c8805-predictor-serving-cert" not found Apr 16 22:22:16.327580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:16.327513 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:16.329958 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:16.329938 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:16.411681 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:16.411646 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:16.542034 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:16.542005 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:22:16.544570 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:22:16.544518 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c42a725_a720_4322_b803_0353062f18fb.slice/crio-5bcc26aa75735402ae07dee281298a0164fbc389dc29c5f75f25b7c8e626078b WatchSource:0}: Error finding container 5bcc26aa75735402ae07dee281298a0164fbc389dc29c5f75f25b7c8e626078b: Status 404 returned error can't find the container with id 5bcc26aa75735402ae07dee281298a0164fbc389dc29c5f75f25b7c8e626078b Apr 16 22:22:17.399724 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:17.399686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerStarted","Data":"5bcc26aa75735402ae07dee281298a0164fbc389dc29c5f75f25b7c8e626078b"} Apr 16 22:22:21.417914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:21.417871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerStarted","Data":"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d"} Apr 16 22:22:24.429717 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:24.429681 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c42a725-a720-4322-b803-0353062f18fb" containerID="8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d" exitCode=0 Apr 16 22:22:24.430101 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:24.429750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" 
event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerDied","Data":"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d"} Apr 16 22:22:34.275162 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:34.275015 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cd9cfb597-xf4kk" podUID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" containerName="console" containerID="cri-o://0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab" gracePeriod=15 Apr 16 22:22:37.141114 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.141092 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cd9cfb597-xf4kk_3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44/console/0.log" Apr 16 22:22:37.141460 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.141155 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:22:37.218285 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218249 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trkl9\" (UniqueName: \"kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218298 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218323 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218358 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218376 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218399 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert\") pod \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218478 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218433 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config\") pod 
\"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\" (UID: \"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44\") " Apr 16 22:22:37.218879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218817 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:22:37.218879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218852 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca" (OuterVolumeSpecName: "service-ca") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:22:37.218976 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218867 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config" (OuterVolumeSpecName: "console-config") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:22:37.218976 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.218857 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:22:37.220612 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.220580 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:22:37.220727 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.220656 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:22:37.220783 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.220764 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9" (OuterVolumeSpecName: "kube-api-access-trkl9") pod "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" (UID: "3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44"). InnerVolumeSpecName "kube-api-access-trkl9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:37.319065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319027 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319057 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319067 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319300 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319076 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319300 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319087 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319300 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319096 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.319300 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.319105 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trkl9\" (UniqueName: \"kubernetes.io/projected/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44-kube-api-access-trkl9\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:22:37.482724 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.482678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerStarted","Data":"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28"} Apr 16 22:22:37.483777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cd9cfb597-xf4kk_3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44/console/0.log" Apr 16 22:22:37.483865 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483797 2571 generic.go:358] "Generic (PLEG): container finished" podID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" containerID="0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab" exitCode=2 Apr 16 22:22:37.483912 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483866 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cd9cfb597-xf4kk" Apr 16 22:22:37.483912 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cd9cfb597-xf4kk" event={"ID":"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44","Type":"ContainerDied","Data":"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab"} Apr 16 22:22:37.483912 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483898 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cd9cfb597-xf4kk" event={"ID":"3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44","Type":"ContainerDied","Data":"47ddcc63845fefd2e546f2acda31f06c2701626dc3d32da6b59ccaef21b40bd9"} Apr 16 22:22:37.484042 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.483914 2571 scope.go:117] "RemoveContainer" containerID="0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab" Apr 16 22:22:37.492400 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.492379 2571 scope.go:117] "RemoveContainer" containerID="0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab" Apr 16 22:22:37.492712 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:22:37.492693 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab\": container with ID starting with 0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab not found: ID does not exist" containerID="0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab" Apr 16 22:22:37.492764 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.492721 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab"} err="failed to get container status \"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab\": rpc error: code = NotFound desc = could not find container \"0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab\": container with ID starting with 0455061c9d9631bec01e2e0e9ab558c9e26922291a505c3aa80f7e72bca7bfab not found: ID does not exist" Apr 16 22:22:37.506617 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.506589 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:22:37.510459 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.510433 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cd9cfb597-xf4kk"] Apr 16 22:22:37.681688 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:37.681653 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" path="/var/lib/kubelet/pods/3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44/volumes" Apr 16 22:22:40.498753 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:40.498714 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerStarted","Data":"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8"} Apr 16 22:22:42.508397 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:42.508363 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" 
event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerStarted","Data":"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80"} Apr 16 22:22:42.508872 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:42.508581 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:42.533994 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:42.533940 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podStartSLOduration=2.669820459 podStartE2EDuration="28.533922592s" podCreationTimestamp="2026-04-16 22:22:14 +0000 UTC" firstStartedPulling="2026-04-16 22:22:16.54678916 +0000 UTC m=+519.478114015" lastFinishedPulling="2026-04-16 22:22:42.410891308 +0000 UTC m=+545.342216148" observedRunningTime="2026-04-16 22:22:42.533386827 +0000 UTC m=+545.464711689" watchObservedRunningTime="2026-04-16 22:22:42.533922592 +0000 UTC m=+545.465247454" Apr 16 22:22:43.511902 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:43.511867 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:43.511902 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:43.511906 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:43.513456 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:43.513430 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:22:43.514177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:43.514150 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:22:44.515211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:44.515169 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:22:44.516200 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:44.516172 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:22:44.519567 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:44.519531 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:22:45.518526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:45.518487 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" 
podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:22:45.518979 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:45.518866 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:22:55.518778 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:55.518727 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:22:55.519259 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:22:55.519129 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:05.518718 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:05.518667 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:23:05.519116 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:05.519080 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:15.518914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:15.518864 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:23:15.519405 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:15.519276 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:25.518563 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:25.518505 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:23:25.518943 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:25.518874 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:35.518579 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:23:35.518516 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:23:35.519036 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:35.519013 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:23:37.570622 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:37.570594 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:23:37.571026 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:37.570841 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:23:45.520133 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:45.520099 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:23:45.520705 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:45.520190 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:23:59.747463 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.747369 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:23:59.749990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.747842 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" containerID="cri-o://74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28" gracePeriod=30 Apr 16 22:23:59.749990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.747980 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" containerID="cri-o://1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8" gracePeriod=30 Apr 16 22:23:59.749990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.747880 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" containerID="cri-o://b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80" gracePeriod=30 Apr 16 22:23:59.843013 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.842985 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"] Apr 16 22:23:59.843438 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.843421 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" containerName="console" Apr 16 22:23:59.843521 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.843440 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" containerName="console" Apr 16 22:23:59.843607 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.843526 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a3dbfbd-d2e3-45f6-8cc6-0ccc8df37b44" containerName="console" Apr 16 22:23:59.847106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.847084 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:23:59.849469 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.849418 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\"" Apr 16 22:23:59.849469 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.849433 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-b6a6b-predictor-serving-cert\"" Apr 16 22:23:59.863215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:23:59.863189 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"] Apr 16 22:24:00.015201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.015128 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.015201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.015170 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g77p\" (UniqueName: \"kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.015385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.015335 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.015385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.015362 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.116745 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.116704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.116745 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.116748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.116953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.116773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.116953 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.116807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g77p\" (UniqueName: \"kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.117215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.117190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.117580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.117534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.119237 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.119219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.124940 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.124919 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g77p\" (UniqueName: \"kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p\") pod \"isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.162203 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.162168 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:00.287785 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.287756 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"] Apr 16 22:24:00.290386 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:24:00.290355 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a7617f_457d_47bc_b9cc_7a633ccb9c3c.slice/crio-efc880c77aafeed45cc279ee6170f9a13d20886f1bece518c692192e4a4f4d6a WatchSource:0}: Error finding container efc880c77aafeed45cc279ee6170f9a13d20886f1bece518c692192e4a4f4d6a: Status 404 returned error can't find the container with id efc880c77aafeed45cc279ee6170f9a13d20886f1bece518c692192e4a4f4d6a Apr 16 22:24:00.292251 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.292237 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:24:00.783442 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.783407 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c42a725-a720-4322-b803-0353062f18fb" containerID="1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8" exitCode=2 Apr 16 22:24:00.783859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.783450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerDied","Data":"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8"} Apr 16 22:24:00.784893 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.784867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerStarted","Data":"b7101605cb452e33024194a476207e3b5ccd869853b733a8019f7f52b19e27ce"} Apr 16 22:24:00.785021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:00.784899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerStarted","Data":"efc880c77aafeed45cc279ee6170f9a13d20886f1bece518c692192e4a4f4d6a"} Apr 16 22:24:04.515743 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:04.515710 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:04.800226 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:04.800139 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerID="b7101605cb452e33024194a476207e3b5ccd869853b733a8019f7f52b19e27ce" exitCode=0 Apr 16 22:24:04.800371 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:04.800219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerDied","Data":"b7101605cb452e33024194a476207e3b5ccd869853b733a8019f7f52b19e27ce"} Apr 16 22:24:05.519215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.519168 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:24:05.519686 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.519512 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:05.807215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.807138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerStarted","Data":"0df9cd068ff40da269bff1286fac652a1e49b2d6b497f9eacd0e79a3c62f6dfe"} Apr 16 22:24:05.807215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.807184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerStarted","Data":"3ebd2d1c388e790ea63d9801ddb39b008f898a2a0d22e7c40edbaad45e0435bc"} Apr 16 22:24:05.807407 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.807387 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:05.825981 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:05.825933 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podStartSLOduration=6.825920227 podStartE2EDuration="6.825920227s" podCreationTimestamp="2026-04-16 22:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:24:05.823616476 +0000 UTC m=+628.754941338" watchObservedRunningTime="2026-04-16 22:24:05.825920227 +0000 UTC m=+628.757245089" Apr 16 22:24:06.810757 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:06.810723 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:06.811856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:06.811831 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:07.816030 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:24:07.815996 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c42a725-a720-4322-b803-0353062f18fb" containerID="74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28" exitCode=0 Apr 16 22:24:07.816389 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:07.816067 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerDied","Data":"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28"} Apr 16 22:24:07.816457 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:07.816381 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:09.515345 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:09.515307 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:12.820385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:12.820356 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:24:12.820972 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:12.820944 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:14.515752 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:14.515711 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:14.516136 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:14.515857 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:24:15.518628 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:15.518588 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:24:15.519010 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:15.518901 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:19.515990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:19.515946 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:22.821507 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:22.821473 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:24.516170 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:24.516130 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:25.519404 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:25.519363 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 22:24:25.519853 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:25.519502 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:24:25.519853 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:25.519722 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:24:25.519853 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:25.519844 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:24:29.515758 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.515714 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 16 22:24:29.886621 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.886594 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:24:29.893251 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.893227 2571 generic.go:358] "Generic (PLEG): container finished" podID="0c42a725-a720-4322-b803-0353062f18fb" containerID="b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80" exitCode=0 Apr 16 22:24:29.893374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.893277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerDied","Data":"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80"} Apr 16 22:24:29.893374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.893305 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" Apr 16 22:24:29.893374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.893321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq" event={"ID":"0c42a725-a720-4322-b803-0353062f18fb","Type":"ContainerDied","Data":"5bcc26aa75735402ae07dee281298a0164fbc389dc29c5f75f25b7c8e626078b"} Apr 16 22:24:29.893374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.893347 2571 scope.go:117] "RemoveContainer" containerID="b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80" Apr 16 22:24:29.903167 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.903150 2571 scope.go:117] "RemoveContainer" containerID="1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8" Apr 16 22:24:29.910291 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.910275 2571 scope.go:117] "RemoveContainer" containerID="74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28" Apr 16 22:24:29.917251 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.917232 2571 scope.go:117] "RemoveContainer" containerID="8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d" Apr 16 22:24:29.926054 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926026 2571 scope.go:117] "RemoveContainer" containerID="b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80" Apr 16 22:24:29.926324 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:24:29.926300 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80\": container with ID starting with b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80 not found: ID does not exist" containerID="b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80" Apr 16 22:24:29.926407 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926331 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80"} err="failed to get container status \"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80\": rpc error: code = NotFound desc = could not find container \"b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80\": container with ID starting with b7f8404e53c773dbd984fddcc2b6c33d1b1ac2a2de25a914d67774ee7e2edc80 not found: ID does not exist" Apr 16 22:24:29.926407 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926358 2571 scope.go:117] 
"RemoveContainer" containerID="1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8" Apr 16 22:24:29.926632 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:24:29.926614 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8\": container with ID starting with 1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8 not found: ID does not exist" containerID="1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8" Apr 16 22:24:29.926680 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926639 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8"} err="failed to get container status \"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8\": rpc error: code = NotFound desc = could not find container \"1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8\": container with ID starting with 1341eacb50d021bdb8180a7b927c91a4ffc6c71cb7b463f8d087dccdb9c8cac8 not found: ID does not exist" Apr 16 22:24:29.926680 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926673 2571 scope.go:117] "RemoveContainer" containerID="74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28" Apr 16 22:24:29.926905 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:24:29.926882 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28\": container with ID starting with 74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28 not found: ID does not exist" containerID="74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28" Apr 16 22:24:29.926999 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926911 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28"} err="failed to get container status \"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28\": rpc error: code = NotFound desc = could not find container \"74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28\": container with ID starting with 74d3807a769ed310c5587ccc582fb3e9995e17be47d553253201a754419e3f28 not found: ID does not exist" Apr 16 22:24:29.926999 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.926928 2571 scope.go:117] "RemoveContainer" containerID="8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d" Apr 16 22:24:29.927173 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:24:29.927157 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d\": container with ID starting with 8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d not found: ID does not exist" containerID="8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d" Apr 16 22:24:29.927231 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.927191 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d"} err="failed to get container status \"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d\": rpc error: code = NotFound 
desc = could not find container \"8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d\": container with ID starting with 8e222d428260af826190031d41d11e89130bcca51a5ac0e833d14b9af6b9777d not found: ID does not exist" Apr 16 22:24:29.969647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.969617 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7nh\" (UniqueName: \"kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh\") pod \"0c42a725-a720-4322-b803-0353062f18fb\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " Apr 16 22:24:29.969786 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.969672 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") pod \"0c42a725-a720-4322-b803-0353062f18fb\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " Apr 16 22:24:29.969786 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.969695 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location\") pod \"0c42a725-a720-4322-b803-0353062f18fb\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " Apr 16 22:24:29.969894 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.969821 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\") pod \"0c42a725-a720-4322-b803-0353062f18fb\" (UID: \"0c42a725-a720-4322-b803-0353062f18fb\") " Apr 16 22:24:29.969982 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.969960 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c42a725-a720-4322-b803-0353062f18fb" (UID: "0c42a725-a720-4322-b803-0353062f18fb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:24:29.970118 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.970103 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c42a725-a720-4322-b803-0353062f18fb-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:24:29.970180 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.970120 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config") pod "0c42a725-a720-4322-b803-0353062f18fb" (UID: "0c42a725-a720-4322-b803-0353062f18fb"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:24:29.971775 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.971757 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0c42a725-a720-4322-b803-0353062f18fb" (UID: "0c42a725-a720-4322-b803-0353062f18fb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:24:29.971861 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:29.971848 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh" (OuterVolumeSpecName: "kube-api-access-2s7nh") pod "0c42a725-a720-4322-b803-0353062f18fb" (UID: "0c42a725-a720-4322-b803-0353062f18fb"). InnerVolumeSpecName "kube-api-access-2s7nh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:24:30.070545 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:30.070475 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c42a725-a720-4322-b803-0353062f18fb-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:24:30.070545 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:30.070502 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c42a725-a720-4322-b803-0353062f18fb-isvc-raw-sklearn-batcher-c8805-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:24:30.070545 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:30.070514 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2s7nh\" (UniqueName: \"kubernetes.io/projected/0c42a725-a720-4322-b803-0353062f18fb-kube-api-access-2s7nh\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:24:30.215144 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:30.215113 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:24:30.219404 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:30.219378 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-c8805-predictor-64549f69f7-dljkq"] Apr 16 22:24:31.679361 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:31.679324 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c42a725-a720-4322-b803-0353062f18fb" path="/var/lib/kubelet/pods/0c42a725-a720-4322-b803-0353062f18fb/volumes" Apr 16 22:24:32.820906 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:32.820868 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:42.821103 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:24:42.821056 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:24:52.821247 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:24:52.821212 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:25:02.821163 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:02.821124 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 22:25:12.821660 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:12.821627 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" Apr 16 22:25:50.089894 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.089791 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"] Apr 16 22:25:50.090419 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.090168 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" containerID="cri-o://3ebd2d1c388e790ea63d9801ddb39b008f898a2a0d22e7c40edbaad45e0435bc" gracePeriod=30 Apr 16 22:25:50.090488 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.090378 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kube-rbac-proxy" containerID="cri-o://0df9cd068ff40da269bff1286fac652a1e49b2d6b497f9eacd0e79a3c62f6dfe" gracePeriod=30 Apr 16 22:25:50.143598 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.143563 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"] Apr 16 22:25:50.144160 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144142 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" Apr 16 22:25:50.144207 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144166 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container" Apr 16 22:25:50.144207 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144196 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="storage-initializer" Apr 16 22:25:50.144207 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144205 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="storage-initializer" Apr 16 22:25:50.144295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144216 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" Apr 16 22:25:50.144295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144226 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent" 
Apr 16 22:25:50.144295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144240 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy"
Apr 16 22:25:50.144295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144249 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy"
Apr 16 22:25:50.144409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144338 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kube-rbac-proxy"
Apr 16 22:25:50.144409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144353 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="agent"
Apr 16 22:25:50.144409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.144366 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c42a725-a720-4322-b803-0353062f18fb" containerName="kserve-container"
Apr 16 22:25:50.148127 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.148102 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.150492 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.150467 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\""
Apr 16 22:25:50.150811 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.150472 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-23b0a-predictor-serving-cert\""
Apr 16 22:25:50.159711 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.159689 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"]
Apr 16 22:25:50.324964 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.324927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.325147 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.324978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.325147 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.325082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.325234 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.325164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.425946 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.425867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.425946 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.425941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.426140 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.425988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.426140 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.426025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.426140 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:25:50.426103 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-23b0a-predictor-serving-cert" not found
Apr 16 22:25:50.426260 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:25:50.426168 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls podName:c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1 nodeName:}" failed. No retries permitted until 2026-04-16 22:25:50.926149791 +0000 UTC m=+733.857474634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" (UID: "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1") : secret "isvc-sklearn-graph-raw-hpa-23b0a-predictor-serving-cert" not found
Apr 16 22:25:50.426481 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.426465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.426623 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.426606 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.436119 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.436097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.930925 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.930894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:50.933257 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:50.933229 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:51.061353 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:51.061319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:51.170248 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:51.170215 2571 generic.go:358] "Generic (PLEG): container finished" podID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerID="0df9cd068ff40da269bff1286fac652a1e49b2d6b497f9eacd0e79a3c62f6dfe" exitCode=2
Apr 16 22:25:51.170611 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:51.170289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerDied","Data":"0df9cd068ff40da269bff1286fac652a1e49b2d6b497f9eacd0e79a3c62f6dfe"}
Apr 16 22:25:51.184573 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:51.184530 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"]
Apr 16 22:25:51.186705 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:25:51.186681 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ee17f7_0b70_4683_bb3c_4f5871d8e4f1.slice/crio-e89374ad33d38df1e91958a6ab4c5314bf483312571a7a4eec343374a824dbd6 WatchSource:0}: Error finding container e89374ad33d38df1e91958a6ab4c5314bf483312571a7a4eec343374a824dbd6: Status 404 returned error can't find the container with id e89374ad33d38df1e91958a6ab4c5314bf483312571a7a4eec343374a824dbd6
Apr 16 22:25:52.174496 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:52.174455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerStarted","Data":"b01bc5decf564301c8cc35438778f889ec12d8b261e7c94c9074427fc45ba69e"}
Apr 16 22:25:52.174496 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:52.174494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerStarted","Data":"e89374ad33d38df1e91958a6ab4c5314bf483312571a7a4eec343374a824dbd6"}
Apr 16 22:25:52.816645 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:52.816605 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused"
Apr 16 22:25:52.820965 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:52.820929 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused"
Apr 16 22:25:54.188331 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.188292 2571 generic.go:358] "Generic (PLEG): container finished" podID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerID="3ebd2d1c388e790ea63d9801ddb39b008f898a2a0d22e7c40edbaad45e0435bc" exitCode=0
Apr 16 22:25:54.188685 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.188361 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerDied","Data":"3ebd2d1c388e790ea63d9801ddb39b008f898a2a0d22e7c40edbaad45e0435bc"}
Apr 16 22:25:54.236505 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.236480 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"
Apr 16 22:25:54.362167 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362088 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") pod \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") "
Apr 16 22:25:54.362167 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362165 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls\") pod \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") "
Apr 16 22:25:54.362374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362189 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location\") pod \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") "
Apr 16 22:25:54.362374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362208 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g77p\" (UniqueName: \"kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p\") pod \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\" (UID: \"59a7617f-457d-47bc-b9cc-7a633ccb9c3c\") "
Apr 16 22:25:54.362520 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362492 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config") pod "59a7617f-457d-47bc-b9cc-7a633ccb9c3c" (UID: "59a7617f-457d-47bc-b9cc-7a633ccb9c3c"). InnerVolumeSpecName "isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:25:54.362607 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.362519 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59a7617f-457d-47bc-b9cc-7a633ccb9c3c" (UID: "59a7617f-457d-47bc-b9cc-7a633ccb9c3c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:25:54.364212 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.364188 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p" (OuterVolumeSpecName: "kube-api-access-5g77p") pod "59a7617f-457d-47bc-b9cc-7a633ccb9c3c" (UID: "59a7617f-457d-47bc-b9cc-7a633ccb9c3c"). InnerVolumeSpecName "kube-api-access-5g77p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:25:54.364315 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.364250 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59a7617f-457d-47bc-b9cc-7a633ccb9c3c" (UID: "59a7617f-457d-47bc-b9cc-7a633ccb9c3c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:25:54.462726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.462686 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:25:54.462726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.462722 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:25:54.462726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.462732 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5g77p\" (UniqueName: \"kubernetes.io/projected/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-kube-api-access-5g77p\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:25:54.462726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:54.462741 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59a7617f-457d-47bc-b9cc-7a633ccb9c3c-isvc-sklearn-graph-raw-b6a6b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:25:55.193866 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.193835 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"
Apr 16 22:25:55.193866 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.193845 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h" event={"ID":"59a7617f-457d-47bc-b9cc-7a633ccb9c3c","Type":"ContainerDied","Data":"efc880c77aafeed45cc279ee6170f9a13d20886f1bece518c692192e4a4f4d6a"}
Apr 16 22:25:55.194391 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.193899 2571 scope.go:117] "RemoveContainer" containerID="0df9cd068ff40da269bff1286fac652a1e49b2d6b497f9eacd0e79a3c62f6dfe"
Apr 16 22:25:55.195278 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.195260 2571 generic.go:358] "Generic (PLEG): container finished" podID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerID="b01bc5decf564301c8cc35438778f889ec12d8b261e7c94c9074427fc45ba69e" exitCode=0
Apr 16 22:25:55.195352 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.195316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerDied","Data":"b01bc5decf564301c8cc35438778f889ec12d8b261e7c94c9074427fc45ba69e"}
Apr 16 22:25:55.202304 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.202283 2571 scope.go:117] "RemoveContainer" containerID="3ebd2d1c388e790ea63d9801ddb39b008f898a2a0d22e7c40edbaad45e0435bc"
Apr 16 22:25:55.209943 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.209925 2571 scope.go:117] "RemoveContainer" containerID="b7101605cb452e33024194a476207e3b5ccd869853b733a8019f7f52b19e27ce"
Apr 16 22:25:55.236599 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.236574 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"]
Apr 16 22:25:55.242879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.242854 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-b6a6b-predictor-54c78fbc4f-6zx2h"]
Apr 16 22:25:55.680193 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:55.680115 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" path="/var/lib/kubelet/pods/59a7617f-457d-47bc-b9cc-7a633ccb9c3c/volumes"
Apr 16 22:25:56.201455 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:56.201419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerStarted","Data":"2cc34ba96a96aba5c3c14e133400e7669cfe4980280ef97b55df8a2a271bde1f"}
Apr 16 22:25:56.201971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:56.201468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerStarted","Data":"9202ef166bc20b69c34828f06dfb60a707cd877376dd4d51fbaf6bb1773c99ed"}
Apr 16 22:25:56.201971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:56.201721 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:56.220088 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:56.220042 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podStartSLOduration=6.220028747 podStartE2EDuration="6.220028747s" podCreationTimestamp="2026-04-16 22:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:56.218967215 +0000 UTC m=+739.150292074" watchObservedRunningTime="2026-04-16 22:25:56.220028747 +0000 UTC m=+739.151353607"
Apr 16 22:25:57.205612 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:57.205576 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:25:57.206970 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:57.206940 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:25:58.209115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:25:58.209073 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:03.213275 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:03.213248 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:26:03.213895 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:03.213866 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:13.214646 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:13.214611 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:23.214331 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:23.214294 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:33.213876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:33.213832 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:43.214679 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:43.214637 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:26:53.213969 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:26:53.213928 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:27:03.214476 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:03.214447 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:27:30.402739 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:30.402694 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"]
Apr 16 22:27:30.403658 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:30.403612 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" containerID="cri-o://9202ef166bc20b69c34828f06dfb60a707cd877376dd4d51fbaf6bb1773c99ed" gracePeriod=30
Apr 16 22:27:30.403747 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:30.403638 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kube-rbac-proxy" containerID="cri-o://2cc34ba96a96aba5c3c14e133400e7669cfe4980280ef97b55df8a2a271bde1f" gracePeriod=30
Apr 16 22:27:30.525633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:30.525605 2571 generic.go:358] "Generic (PLEG): container finished" podID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerID="2cc34ba96a96aba5c3c14e133400e7669cfe4980280ef97b55df8a2a271bde1f" exitCode=2
Apr 16 22:27:30.525765 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:30.525678 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerDied","Data":"2cc34ba96a96aba5c3c14e133400e7669cfe4980280ef97b55df8a2a271bde1f"}
Apr 16 22:27:33.209876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:33.209834 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.39:8643/healthz\": dial tcp 10.132.0.39:8643: connect: connection refused"
Apr 16 22:27:33.214211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:33.214188 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused"
Apr 16 22:27:34.541387 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.541355 2571 generic.go:358] "Generic (PLEG): container finished" podID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerID="9202ef166bc20b69c34828f06dfb60a707cd877376dd4d51fbaf6bb1773c99ed" exitCode=0
Apr 16 22:27:34.541804 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.541400 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerDied","Data":"9202ef166bc20b69c34828f06dfb60a707cd877376dd4d51fbaf6bb1773c99ed"}
Apr 16 22:27:34.560909 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.560887 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"
Apr 16 22:27:34.697680 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.697596 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv\") pod \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") "
Apr 16 22:27:34.697825 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.697687 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") pod \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") "
Apr 16 22:27:34.697825 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.697788 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") pod \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") "
Apr 16 22:27:34.697901 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.697831 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location\") pod \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\" (UID: \"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1\") "
Apr 16 22:27:34.698152 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.698123 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" (UID: "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:34.698228 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.698134 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config") pod "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" (UID: "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:27:34.699730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.699702 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" (UID: "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:27:34.699730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.699711 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv" (OuterVolumeSpecName: "kube-api-access-5tbqv") pod "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" (UID: "c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1"). InnerVolumeSpecName "kube-api-access-5tbqv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:27:34.799353 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.799320 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-isvc-sklearn-graph-raw-hpa-23b0a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.799353 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.799349 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.799533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.799367 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-kube-api-access-5tbqv\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:27:34.799533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:34.799376 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:27:35.546342 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.546317 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" Apr 16 22:27:35.546747 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.546311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg" event={"ID":"c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1","Type":"ContainerDied","Data":"e89374ad33d38df1e91958a6ab4c5314bf483312571a7a4eec343374a824dbd6"} Apr 16 22:27:35.546747 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.546433 2571 scope.go:117] "RemoveContainer" containerID="2cc34ba96a96aba5c3c14e133400e7669cfe4980280ef97b55df8a2a271bde1f" Apr 16 22:27:35.555031 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.555017 2571 scope.go:117] "RemoveContainer" containerID="9202ef166bc20b69c34828f06dfb60a707cd877376dd4d51fbaf6bb1773c99ed" Apr 16 22:27:35.562043 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.562026 2571 scope.go:117] "RemoveContainer" containerID="b01bc5decf564301c8cc35438778f889ec12d8b261e7c94c9074427fc45ba69e" Apr 16 22:27:35.573819 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.573787 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"] Apr 16 22:27:35.573991 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.573837 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-23b0a-predictor-5469f6f6b5-2nmfg"] Apr 16 22:27:35.678860 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:35.678817 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" path="/var/lib/kubelet/pods/c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1/volumes" Apr 16 22:27:50.462332 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.461796 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465006 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="storage-initializer" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465036 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="storage-initializer" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465067 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kube-rbac-proxy" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465076 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kube-rbac-proxy" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465093 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465103 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465133 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="storage-initializer" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465142 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="storage-initializer" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465153 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465161 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465172 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kube-rbac-proxy" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465181 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kube-rbac-proxy" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465346 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465367 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kserve-container" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465378 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="59a7617f-457d-47bc-b9cc-7a633ccb9c3c" containerName="kube-rbac-proxy" Apr 16 22:27:50.467219 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.465396 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3ee17f7-0b70-4683-bb3c-4f5871d8e4f1" containerName="kube-rbac-proxy" Apr 16 22:27:50.470331 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.470307 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.473073 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.473048 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:27:50.473322 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.473298 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:27:50.474049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.473441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-1bd3d-predictor-serving-cert\"" Apr 16 22:27:50.474049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.473507 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gj7r7\"" Apr 16 22:27:50.474049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.473693 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\"" Apr 16 22:27:50.475369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.475323 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:27:50.529775 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.529745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.529921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.529785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.529921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.529868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.529921 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.529891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gct4\" (UniqueName: \"kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631019 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.630995 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.631030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.631064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.631084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gct4\" (UniqueName: \"kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631347 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:27:50.631207 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-serving-cert: secret "isvc-logger-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:50.631347 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:27:50.631286 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls podName:6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39 nodeName:}" failed. No retries permitted until 2026-04-16 22:27:51.131263655 +0000 UTC m=+854.062588506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls") pod "isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" (UID: "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39") : secret "isvc-logger-raw-1bd3d-predictor-serving-cert" not found Apr 16 22:27:50.631500 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.631478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.631679 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.631661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:50.639225 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:50.639205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gct4\" (UniqueName: \"kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:51.134078 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:51.134026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:51.136477 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:51.136442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") pod \"isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:51.382643 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:51.382607 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:51.710247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:51.710221 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:27:51.712593 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:27:51.712564 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd07d0c_e1b5_4347_a1b2_d8a6fa2c7c39.slice/crio-cc652cfd669d556df78cb6c687ae17f717b3d477da0152177e6f747c294705bd WatchSource:0}: Error finding container cc652cfd669d556df78cb6c687ae17f717b3d477da0152177e6f747c294705bd: Status 404 returned error can't find the container with id cc652cfd669d556df78cb6c687ae17f717b3d477da0152177e6f747c294705bd Apr 16 22:27:52.608110 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:52.608075 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerStarted","Data":"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8"} Apr 16 22:27:52.608110 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:52.608114 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerStarted","Data":"cc652cfd669d556df78cb6c687ae17f717b3d477da0152177e6f747c294705bd"} Apr 16 22:27:55.620374 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:55.620292 2571 generic.go:358] "Generic (PLEG): container finished" podID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerID="5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8" exitCode=0 Apr 16 22:27:55.620732 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:55.620365 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerDied","Data":"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8"} Apr 16 22:27:56.625914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.625874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerStarted","Data":"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561"} Apr 16 22:27:56.625914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.625911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerStarted","Data":"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421"} Apr 16 22:27:56.625914 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.625921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerStarted","Data":"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e"} Apr 16 22:27:56.626332 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.626218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" 
Apr 16 22:27:56.626369 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.626347 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:56.627409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.627387 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:27:56.645843 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:56.645802 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podStartSLOduration=6.645791798 podStartE2EDuration="6.645791798s" podCreationTimestamp="2026-04-16 22:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:27:56.644838525 +0000 UTC m=+859.576163390" watchObservedRunningTime="2026-04-16 22:27:56.645791798 +0000 UTC m=+859.577116659" Apr 16 22:27:57.629847 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:57.629795 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:27:57.629847 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:57.629838 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:27:57.630795 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:57.630763 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:58.632701 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:58.632661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:27:58.633156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:27:58.633119 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:03.636607 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:03.636568 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:28:03.637132 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:03.637092 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:03.637608 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:03.637585 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:13.637063 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:13.637016 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:13.637518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:13.637453 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:23.637699 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:23.637661 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:23.638226 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:23.638201 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:33.637573 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:33.637463 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:33.638112 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:33.637936 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:37.595997 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:37.595970 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:28:37.598936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:37.598910 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:28:43.637879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:43.637820 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:43.638319 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:43.638275 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:53.637057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:53.637011 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:28:53.637521 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:28:53.637495 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:29:03.637761 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:03.637722 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:03.640195 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:03.638228 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:15.681312 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.681269 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:29:15.681933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.681716 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" containerID="cri-o://8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e" gracePeriod=30 Apr 16 22:29:15.681933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.681756 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" containerID="cri-o://179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561" gracePeriod=30 Apr 16 22:29:15.681933 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.681839 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" containerID="cri-o://9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421" gracePeriod=30 Apr 16 22:29:15.724159 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.724120 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:29:15.728363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.728339 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.730847 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.730822 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-d1ab0-predictor-serving-cert\"" Apr 16 22:29:15.730957 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.730822 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\"" Apr 16 22:29:15.737009 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.736764 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:29:15.844046 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.844005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrd4\" (UniqueName: \"kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.844267 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.844134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.844267 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.844171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.844267 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.844216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.907471 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.907436 2571 generic.go:358] "Generic (PLEG): container finished" podID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerID="9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421" exitCode=2 Apr 16 22:29:15.907665 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.907507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerDied","Data":"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421"} Apr 16 22:29:15.945619 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:29:15.945498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrd4\" (UniqueName: \"kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.945619 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.945609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.945863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.945634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.945863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.945661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.946098 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.946072 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.946335 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.946315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.948180 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.948159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:15.953974 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:15.953950 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2vrd4\" (UniqueName: \"kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4\") pod \"isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:16.041533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:16.041490 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:16.172888 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:16.172857 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:29:16.175437 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:29:16.175406 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20aaa506_0e1b_4919_81cc_ff53ebe71f48.slice/crio-7fb13a42835915b2001de986f94d7452fcead6b586b43fd4f6d236fdb2311254 WatchSource:0}: Error finding container 7fb13a42835915b2001de986f94d7452fcead6b586b43fd4f6d236fdb2311254: Status 404 returned error can't find the container with id 7fb13a42835915b2001de986f94d7452fcead6b586b43fd4f6d236fdb2311254 Apr 16 22:29:16.177252 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:16.177233 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:29:16.912256 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:16.912197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerStarted","Data":"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3"} Apr 16 22:29:16.912256 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:16.912242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerStarted","Data":"7fb13a42835915b2001de986f94d7452fcead6b586b43fd4f6d236fdb2311254"} Apr 16 22:29:18.633012 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:18.632965 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:19.926412 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:19.926378 2571 generic.go:358] "Generic (PLEG): container finished" podID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerID="3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3" exitCode=0 Apr 16 22:29:19.926845 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:19.926455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerDied","Data":"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3"} Apr 16 22:29:20.932394 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.932357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" 
event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerStarted","Data":"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f"} Apr 16 22:29:20.932899 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.932404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerStarted","Data":"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71"} Apr 16 22:29:20.932899 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.932682 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:20.934618 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.934591 2571 generic.go:358] "Generic (PLEG): container finished" podID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerID="8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e" exitCode=0 Apr 16 22:29:20.934730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.934629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerDied","Data":"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e"} Apr 16 22:29:20.951057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:20.951000 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podStartSLOduration=5.950982113 podStartE2EDuration="5.950982113s" podCreationTimestamp="2026-04-16 22:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:29:20.949195259 +0000 UTC m=+943.880520122" watchObservedRunningTime="2026-04-16 22:29:20.950982113 +0000 UTC m=+943.882306976" Apr 16 22:29:21.937518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:21.937486 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:21.939049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:21.939021 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:29:22.941031 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:22.940995 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:29:23.633799 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:23.633754 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:23.638085 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:23.638052 2571 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:29:23.638794 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:23.638767 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:29:27.945355 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:27.945317 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:29:27.945973 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:27.945941 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:29:28.633846 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:28.633792 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:28.634059 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:28.633968 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:33.633647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:33.633597 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:33.637293 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:33.637263 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:29:33.638909 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:33.638883 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:29:37.946236 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:37.946197 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:29:38.633786 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:38.633731 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:43.633592 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:43.633518 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 16 22:29:43.637940 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:43.637906 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 22:29:43.638085 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:43.638051 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:43.638836 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:43.638806 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:29:43.638936 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:43.638908 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:45.841233 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.841207 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:45.906109 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906075 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location\") pod \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " Apr 16 22:29:45.906303 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906121 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") pod \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " Apr 16 22:29:45.906303 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906144 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\") pod \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " Apr 16 22:29:45.906303 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906173 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gct4\" (UniqueName: \"kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4\") pod \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\" (UID: \"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39\") " Apr 16 22:29:45.906489 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906450 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" (UID: "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:29:45.906614 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.906591 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config") pod "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" (UID: "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39"). InnerVolumeSpecName "isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:29:45.908216 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.908191 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" (UID: "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:29:45.908319 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:45.908222 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4" (OuterVolumeSpecName: "kube-api-access-2gct4") pod "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" (UID: "6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39"). InnerVolumeSpecName "kube-api-access-2gct4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:29:46.007172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.007145 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gct4\" (UniqueName: \"kubernetes.io/projected/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kube-api-access-2gct4\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:29:46.007172 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.007171 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:29:46.007422 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.007181 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:29:46.007422 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.007192 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39-isvc-logger-raw-1bd3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:29:46.032344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.032308 2571 generic.go:358] "Generic (PLEG): container finished" podID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerID="179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561" exitCode=0 Apr 16 22:29:46.032509 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.032390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerDied","Data":"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561"} Apr 16 22:29:46.032509 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.032430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" event={"ID":"6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39","Type":"ContainerDied","Data":"cc652cfd669d556df78cb6c687ae17f717b3d477da0152177e6f747c294705bd"} Apr 16 22:29:46.032509 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.032440 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7" Apr 16 22:29:46.032662 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.032448 2571 scope.go:117] "RemoveContainer" containerID="179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561" Apr 16 22:29:46.041028 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.041010 2571 scope.go:117] "RemoveContainer" containerID="9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421" Apr 16 22:29:46.049404 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.049381 2571 scope.go:117] "RemoveContainer" containerID="8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e" Apr 16 22:29:46.057189 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.057160 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:29:46.058190 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.058173 2571 scope.go:117] "RemoveContainer" containerID="5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8" Apr 16 22:29:46.062173 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.062146 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-1bd3d-predictor-774f6dc856-dnwm7"] Apr 16 22:29:46.066494 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.066477 2571 scope.go:117] "RemoveContainer" containerID="179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561" Apr 16 22:29:46.066752 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:29:46.066736 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561\": container with ID starting with 179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561 not found: ID does not exist" containerID="179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561" Apr 16 22:29:46.066802 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.066761 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561"} err="failed to get container status \"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561\": rpc error: code = NotFound desc = could not find container \"179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561\": container with ID starting with 179babeb67867c2801c7d7743593a2267f691e1b0afe142b353dbf478d9a5561 not found: ID does not exist" Apr 16 22:29:46.066802 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.066780 2571 scope.go:117] "RemoveContainer" containerID="9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421" Apr 16 22:29:46.067003 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:29:46.066987 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421\": container with ID starting with 9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421 not found: ID does not exist" containerID="9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421" Apr 16 22:29:46.067048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.067007 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421"} err="failed 
to get container status \"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421\": rpc error: code = NotFound desc = could not find container \"9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421\": container with ID starting with 9893c3049115a85923ba992dead7f13c88898f64e2ad72fde8b74b1e94293421 not found: ID does not exist" Apr 16 22:29:46.067048 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.067020 2571 scope.go:117] "RemoveContainer" containerID="8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e" Apr 16 22:29:46.067209 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:29:46.067191 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e\": container with ID starting with 8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e not found: ID does not exist" containerID="8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e" Apr 16 22:29:46.067251 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.067212 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e"} err="failed to get container status \"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e\": rpc error: code = NotFound desc = could not find container \"8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e\": container with ID starting with 8305a8b9ad3f730b332e525b521e4fcf1ba54a2ff7fc23200d9ea05f992a435e not found: ID does not exist" Apr 16 22:29:46.067251 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.067223 2571 scope.go:117] "RemoveContainer" containerID="5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8" Apr 16 22:29:46.067469 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:29:46.067449 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8\": container with ID starting with 5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8 not found: ID does not exist" containerID="5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8" Apr 16 22:29:46.067512 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:46.067476 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8"} err="failed to get container status \"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8\": rpc error: code = NotFound desc = could not find container \"5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8\": container with ID starting with 5d7987f7958ec52ae73246d2df63d6029d98d834f15c64f9bbd881ba8da848c8 not found: ID does not exist" Apr 16 22:29:47.678878 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:47.678835 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" path="/var/lib/kubelet/pods/6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39/volumes" Apr 16 22:29:47.946346 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:47.946254 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.41:8080: connect: connection refused" Apr 16 22:29:57.946879 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:29:57.946785 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:07.946673 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:07.946631 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:17.946797 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:17.946753 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:27.946811 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:27.946768 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:37.945943 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:37.945846 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:47.946740 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:47.946702 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:30:56.674674 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:30:56.674614 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:31:06.674912 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:06.674855 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:31:16.675254 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:16.675200 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:31:26.674832 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:31:26.674737 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:31:36.675750 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:36.675716 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:31:45.915909 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:45.915868 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:31:45.916726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:45.916692 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" containerID="cri-o://f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71" gracePeriod=30 Apr 16 22:31:45.916987 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:45.916735 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" containerID="cri-o://49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f" gracePeriod=30 Apr 16 22:31:46.024743 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.024708 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:31:46.025303 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025283 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" Apr 16 22:31:46.025355 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025310 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" Apr 16 22:31:46.025391 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025352 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="storage-initializer" Apr 16 22:31:46.025391 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025362 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="storage-initializer" Apr 16 22:31:46.025391 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025374 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" Apr 16 22:31:46.025391 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025385 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" Apr 16 22:31:46.025518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025401 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" Apr 16 22:31:46.025518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025410 2571 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" Apr 16 22:31:46.025518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025497 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kube-rbac-proxy" Apr 16 22:31:46.025634 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025518 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="agent" Apr 16 22:31:46.025634 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.025530 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fd07d0c-e1b5-4347-a1b2-d8a6fa2c7c39" containerName="kserve-container" Apr 16 22:31:46.029186 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.029167 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.031972 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.031950 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-0da70d-predictor-serving-cert\"" Apr 16 22:31:46.032081 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.032024 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-0da70d-kube-rbac-proxy-sar-config\"" Apr 16 22:31:46.037293 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.036980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:31:46.125622 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.125586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.125790 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.125641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.125790 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.125697 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.125790 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.125717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g94r\" (UniqueName: \"kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " 
pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.226820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.226738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.226820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.226775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g94r\" (UniqueName: \"kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.226820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.226815 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.227069 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.226849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.227069 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:31:46.226996 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-0da70d-predictor-serving-cert: secret "isvc-primary-0da70d-predictor-serving-cert" not found Apr 16 22:31:46.227144 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:31:46.227080 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls podName:c142c901-c0c6-429b-8b01-e6776240ae4a nodeName:}" failed. No retries permitted until 2026-04-16 22:31:46.727057642 +0000 UTC m=+1089.658382486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls") pod "isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" (UID: "c142c901-c0c6-429b-8b01-e6776240ae4a") : secret "isvc-primary-0da70d-predictor-serving-cert" not found Apr 16 22:31:46.227263 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.227246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.227573 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.227529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.237451 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.237427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g94r\" (UniqueName: \"kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.446785 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.446750 2571 generic.go:358] "Generic (PLEG): container finished" podID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerID="49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f" exitCode=2 Apr 16 22:31:46.446785 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.446790 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerDied","Data":"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f"} Apr 16 22:31:46.674876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.674833 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 22:31:46.730313 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.730278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.732849 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.732822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") pod \"isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s\" (UID: 
\"c142c901-c0c6-429b-8b01-e6776240ae4a\") " pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:46.941648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:46.941527 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:47.066054 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:47.066018 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:31:47.069670 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:31:47.069646 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc142c901_c0c6_429b_8b01_e6776240ae4a.slice/crio-32a352dd1c4dcb952b2b494497444d97720fe4201747993a004eb4ee6b820997 WatchSource:0}: Error finding container 32a352dd1c4dcb952b2b494497444d97720fe4201747993a004eb4ee6b820997: Status 404 returned error can't find the container with id 32a352dd1c4dcb952b2b494497444d97720fe4201747993a004eb4ee6b820997 Apr 16 22:31:47.451344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:47.451305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerStarted","Data":"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd"} Apr 16 22:31:47.451344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:47.451342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerStarted","Data":"32a352dd1c4dcb952b2b494497444d97720fe4201747993a004eb4ee6b820997"} Apr 16 22:31:47.941354 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:47.941309 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 16 22:31:51.467206 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:51.467169 2571 generic.go:358] "Generic (PLEG): container finished" podID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerID="027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd" exitCode=0 Apr 16 22:31:51.467707 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:51.467244 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerDied","Data":"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd"} Apr 16 22:31:52.473133 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.473096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerStarted","Data":"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0"} Apr 16 22:31:52.473133 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.473140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" 
event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerStarted","Data":"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb"} Apr 16 22:31:52.473679 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.473435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:52.473679 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.473578 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:52.475089 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.475062 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:31:52.493502 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.493445 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podStartSLOduration=6.493429003 podStartE2EDuration="6.493429003s" podCreationTimestamp="2026-04-16 22:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:31:52.491295977 +0000 UTC m=+1095.422620863" watchObservedRunningTime="2026-04-16 22:31:52.493429003 +0000 UTC m=+1095.424753864" Apr 16 22:31:52.941746 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:52.941699 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 16 22:31:53.476853 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:53.476810 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:31:56.663730 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.663704 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:31:56.717789 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.717692 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\") pod \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " Apr 16 22:31:56.717789 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.717772 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrd4\" (UniqueName: \"kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4\") pod \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " Apr 16 22:31:56.718057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.717863 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls\") pod \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " Apr 16 22:31:56.718057 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.717958 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location\") pod \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\" (UID: \"20aaa506-0e1b-4919-81cc-ff53ebe71f48\") " Apr 16 22:31:56.718178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.718122 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config") pod "20aaa506-0e1b-4919-81cc-ff53ebe71f48" (UID: "20aaa506-0e1b-4919-81cc-ff53ebe71f48"). InnerVolumeSpecName "isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:31:56.718245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.718227 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/20aaa506-0e1b-4919-81cc-ff53ebe71f48-isvc-sklearn-scale-raw-d1ab0-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:31:56.718351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.718331 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "20aaa506-0e1b-4919-81cc-ff53ebe71f48" (UID: "20aaa506-0e1b-4919-81cc-ff53ebe71f48"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:31:56.719884 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.719858 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4" (OuterVolumeSpecName: "kube-api-access-2vrd4") pod "20aaa506-0e1b-4919-81cc-ff53ebe71f48" (UID: "20aaa506-0e1b-4919-81cc-ff53ebe71f48"). 
InnerVolumeSpecName "kube-api-access-2vrd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:31:56.719991 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.719974 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "20aaa506-0e1b-4919-81cc-ff53ebe71f48" (UID: "20aaa506-0e1b-4919-81cc-ff53ebe71f48"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:31:56.819156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.819117 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:31:56.819156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.819151 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vrd4\" (UniqueName: \"kubernetes.io/projected/20aaa506-0e1b-4919-81cc-ff53ebe71f48-kube-api-access-2vrd4\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:31:56.819156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:56.819161 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/20aaa506-0e1b-4919-81cc-ff53ebe71f48-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:31:57.493340 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.493300 2571 generic.go:358] "Generic (PLEG): container finished" podID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerID="f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71" exitCode=0 Apr 16 22:31:57.493340 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.493337 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerDied","Data":"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71"} Apr 16 22:31:57.493594 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.493372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" event={"ID":"20aaa506-0e1b-4919-81cc-ff53ebe71f48","Type":"ContainerDied","Data":"7fb13a42835915b2001de986f94d7452fcead6b586b43fd4f6d236fdb2311254"} Apr 16 22:31:57.493594 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.493388 2571 scope.go:117] "RemoveContainer" containerID="49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f" Apr 16 22:31:57.493594 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.493403 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f" Apr 16 22:31:57.502147 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.502129 2571 scope.go:117] "RemoveContainer" containerID="f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71" Apr 16 22:31:57.509649 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.509631 2571 scope.go:117] "RemoveContainer" containerID="3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3" Apr 16 22:31:57.516260 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.516209 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:31:57.519424 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.519398 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-d1ab0-predictor-9c4cc65d7-qxz8f"] Apr 16 22:31:57.519782 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.519762 2571 scope.go:117] "RemoveContainer" containerID="49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f" Apr 16 22:31:57.520100 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:31:57.520081 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f\": container with ID starting with 49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f not found: ID does not exist" containerID="49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f" Apr 16 22:31:57.520148 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.520110 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f"} err="failed to get container status \"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f\": rpc error: code = NotFound desc = could not find container \"49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f\": container with ID starting with 49214d7b6856257318afa5fa68fe0df16046c427a6ae7f3e20804fb12081922f not found: ID does not exist" Apr 16 22:31:57.520148 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.520129 2571 scope.go:117] "RemoveContainer" containerID="f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71" Apr 16 22:31:57.520386 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:31:57.520368 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71\": container with ID starting with f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71 not found: ID does not exist" containerID="f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71" Apr 16 22:31:57.520433 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.520392 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71"} err="failed to get container status \"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71\": rpc error: code = NotFound desc = could not find container \"f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71\": container with ID starting with f75b80744a27e46a6bf90f8d2d65c4e296729e1fe8e00bc1adb0c16a71c8da71 not found: ID does not exist" Apr 16 22:31:57.520433 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.520409 2571 scope.go:117] "RemoveContainer" containerID="3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3" Apr 16 22:31:57.520660 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:31:57.520642 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3\": container with ID starting with 3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3 not found: ID does not exist" containerID="3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3" Apr 16 22:31:57.520748 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.520665 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3"} err="failed to get container status \"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3\": rpc error: code = NotFound desc = could not find container \"3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3\": container with ID starting with 3904e5e7d11734549993f00f1c2d41f09aeca43e07215d64964088651af5c1e3 not found: ID does not exist" Apr 16 22:31:57.679764 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:57.679723 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" path="/var/lib/kubelet/pods/20aaa506-0e1b-4919-81cc-ff53ebe71f48/volumes" Apr 16 22:31:58.481004 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:58.480973 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:31:58.481601 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:31:58.481574 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:08.482248 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:08.482204 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:18.482416 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:18.482374 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:28.482295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:28.482251 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:38.481945 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:38.481900 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:48.481666 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:48.481625 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:32:58.482465 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:32:58.482386 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:33:06.164926 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.164891 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165327 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="storage-initializer" Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165343 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="storage-initializer" Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165356 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165364 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165374 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" Apr 16 22:33:06.165388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165379 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" Apr 16 22:33:06.165729 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165441 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kserve-container" Apr 16 22:33:06.165729 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.165454 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="20aaa506-0e1b-4919-81cc-ff53ebe71f48" containerName="kube-rbac-proxy" Apr 16 22:33:06.168812 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.168791 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.172044 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.171932 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 22:33:06.172218 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.172196 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\"" Apr 16 22:33:06.172332 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.172312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-0da70d-predictor-serving-cert\"" Apr 16 22:33:06.172433 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.172417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-0da70d-dockercfg-tk4f7\"" Apr 16 22:33:06.172912 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.172895 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-0da70d\"" Apr 16 22:33:06.181533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.181506 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:06.323421 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.323379 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.323647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.323440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9bj\" (UniqueName: \"kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.323647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.323487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.323647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.323512 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.323647 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.323631 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424149 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424149 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9bj\" (UniqueName: \"kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424149 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424423 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424423 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:06.424294 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-serving-cert: secret "isvc-secondary-0da70d-predictor-serving-cert" not found Apr 16 22:33:06.424423 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:06.424373 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls podName:6748ee0c-9307-49bd-8249-81eed9674e2a nodeName:}" failed. No retries permitted until 2026-04-16 22:33:06.924351381 +0000 UTC m=+1169.855676226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls") pod "isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a") : secret "isvc-secondary-0da70d-predictor-serving-cert" not found Apr 16 22:33:06.424609 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424814 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424794 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424952 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.424990 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.424931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.432169 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.432141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj9bj\" (UniqueName: \"kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.929722 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.929673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:06.932252 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:06.932229 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " 
pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:07.079465 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:07.079430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:07.211726 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:07.211702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:07.214130 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:33:07.214091 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6748ee0c_9307_49bd_8249_81eed9674e2a.slice/crio-ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f WatchSource:0}: Error finding container ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f: Status 404 returned error can't find the container with id ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f Apr 16 22:33:07.732913 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:07.732875 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerStarted","Data":"d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041"} Apr 16 22:33:07.732913 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:07.732914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerStarted","Data":"ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f"} Apr 16 22:33:12.750875 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:12.750847 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/0.log" Apr 16 22:33:12.751254 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:12.750895 2571 generic.go:358] "Generic (PLEG): container finished" podID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerID="d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041" exitCode=1 Apr 16 22:33:12.751254 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:12.750934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerDied","Data":"d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041"} Apr 16 22:33:13.755648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:13.755619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/0.log" Apr 16 22:33:13.756039 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:13.755699 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerStarted","Data":"f3c796830097d31b2f148bb78d68d65cbcbcbf21aafe21f76012ee62febdf15e"} Apr 16 22:33:19.779896 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.779867 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/1.log" Apr 16 22:33:19.780340 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.780225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/0.log" Apr 16 22:33:19.780340 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.780260 2571 generic.go:358] "Generic (PLEG): container finished" podID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerID="f3c796830097d31b2f148bb78d68d65cbcbcbf21aafe21f76012ee62febdf15e" exitCode=1 Apr 16 22:33:19.780340 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.780331 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerDied","Data":"f3c796830097d31b2f148bb78d68d65cbcbcbf21aafe21f76012ee62febdf15e"} Apr 16 22:33:19.780453 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.780372 2571 scope.go:117] "RemoveContainer" containerID="d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041" Apr 16 22:33:19.780843 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:19.780820 2571 scope.go:117] "RemoveContainer" containerID="d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041" Apr 16 22:33:19.791499 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:19.791470 2571 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_kserve-ci-e2e-test_6748ee0c-9307-49bd-8249-81eed9674e2a_0 in pod sandbox ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f from index: no such id: 'd12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041'" containerID="d12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041" Apr 16 22:33:19.791607 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:19.791525 2571 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_kserve-ci-e2e-test_6748ee0c-9307-49bd-8249-81eed9674e2a_0 in pod sandbox ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f from index: no such id: 'd12c5d23d6c9b850f71b5f52f8286341334c250b6e9982d21e21005c263ff041'; Skipping pod \"isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_kserve-ci-e2e-test(6748ee0c-9307-49bd-8249-81eed9674e2a)\"" logger="UnhandledError" Apr 16 22:33:19.792934 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:19.792913 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_kserve-ci-e2e-test(6748ee0c-9307-49bd-8249-81eed9674e2a)\"" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" Apr 16 22:33:20.785250 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:20.785220 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/1.log" Apr 16 22:33:24.250390 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.250350 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:24.306435 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.306391 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:33:24.306890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.306801 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" containerID="cri-o://7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb" gracePeriod=30 Apr 16 22:33:24.306890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.306856 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kube-rbac-proxy" containerID="cri-o://d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0" gracePeriod=30 Apr 16 22:33:24.429215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.429182 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"] Apr 16 22:33:24.430361 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.430342 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/1.log" Apr 16 22:33:24.430468 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.430404 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:24.433580 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.433536 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.436005 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.435971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-a0bd10\"" Apr 16 22:33:24.436102 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.436055 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a0bd10-predictor-serving-cert\"" Apr 16 22:33:24.436102 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.436095 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\"" Apr 16 22:33:24.436201 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.436162 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-a0bd10-dockercfg-hpqtv\"" Apr 16 22:33:24.440523 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.440496 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"] Apr 16 22:33:24.489473 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489442 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert\") pod \"6748ee0c-9307-49bd-8249-81eed9674e2a\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " Apr 16 22:33:24.489700 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.489700 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.489796 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8dd5\" (UniqueName: \"kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.489796 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.489931 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:33:24.489811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.489974 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.489940 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6748ee0c-9307-49bd-8249-81eed9674e2a" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:33:24.590783 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.590679 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config\") pod \"6748ee0c-9307-49bd-8249-81eed9674e2a\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " Apr 16 22:33:24.590783 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.590765 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location\") pod \"6748ee0c-9307-49bd-8249-81eed9674e2a\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " Apr 16 22:33:24.590994 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.590814 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj9bj\" (UniqueName: \"kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj\") pod \"6748ee0c-9307-49bd-8249-81eed9674e2a\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " Apr 16 22:33:24.590994 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.590868 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") pod \"6748ee0c-9307-49bd-8249-81eed9674e2a\" (UID: \"6748ee0c-9307-49bd-8249-81eed9674e2a\") " Apr 16 22:33:24.590994 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.590948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:33:24.591006 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6748ee0c-9307-49bd-8249-81eed9674e2a" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8dd5\" (UniqueName: \"kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591108 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-0da70d-kube-rbac-proxy-sar-config") pod "6748ee0c-9307-49bd-8249-81eed9674e2a" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a"). InnerVolumeSpecName "isvc-secondary-0da70d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591177 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:24.591212 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-serving-cert: secret "isvc-init-fail-a0bd10-predictor-serving-cert" not found Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:24.591269 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls podName:949ab304-d3c9-418c-a99f-e2fd1302a0e5 nodeName:}" failed. No retries permitted until 2026-04-16 22:33:25.091249765 +0000 UTC m=+1188.022574607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls") pod "isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5") : secret "isvc-init-fail-a0bd10-predictor-serving-cert" not found Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591267 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-isvc-secondary-0da70d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591291 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6748ee0c-9307-49bd-8249-81eed9674e2a-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.591495 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591373 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6748ee0c-9307-49bd-8249-81eed9674e2a-cabundle-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:24.591989 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.591962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.592086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.592058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.593058 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.593034 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj" (OuterVolumeSpecName: "kube-api-access-kj9bj") pod "6748ee0c-9307-49bd-8249-81eed9674e2a" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a"). InnerVolumeSpecName "kube-api-access-kj9bj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:33:24.593214 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.593197 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6748ee0c-9307-49bd-8249-81eed9674e2a" (UID: "6748ee0c-9307-49bd-8249-81eed9674e2a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:33:24.599876 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.599846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8dd5\" (UniqueName: \"kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:24.692021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.691971 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kj9bj\" (UniqueName: \"kubernetes.io/projected/6748ee0c-9307-49bd-8249-81eed9674e2a-kube-api-access-kj9bj\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:24.692021 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.692024 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6748ee0c-9307-49bd-8249-81eed9674e2a-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:24.802045 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.802016 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5_6748ee0c-9307-49bd-8249-81eed9674e2a/storage-initializer/1.log" Apr 16 22:33:24.802221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.802147 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" Apr 16 22:33:24.802221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.802159 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5" event={"ID":"6748ee0c-9307-49bd-8249-81eed9674e2a","Type":"ContainerDied","Data":"ad3a9adf1dda9cd54bf125a12bf7c7241ef3898bddecc23d4e33b03e93cc204f"} Apr 16 22:33:24.802221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.802200 2571 scope.go:117] "RemoveContainer" containerID="f3c796830097d31b2f148bb78d68d65cbcbcbf21aafe21f76012ee62febdf15e" Apr 16 22:33:24.804517 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.804486 2571 generic.go:358] "Generic (PLEG): container finished" podID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerID="d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0" exitCode=2 Apr 16 22:33:24.804648 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.804538 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerDied","Data":"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0"} Apr 16 22:33:24.838926 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.838835 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:24.841076 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:24.841003 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0da70d-predictor-5d65bb88c5-shvw5"] Apr 16 22:33:25.096305 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.096205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:25.098691 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.098672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") pod \"isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") " pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:25.346502 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.346382 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" Apr 16 22:33:25.490344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.490255 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"] Apr 16 22:33:25.494072 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:33:25.494039 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod949ab304_d3c9_418c_a99f_e2fd1302a0e5.slice/crio-97b196055e5eedda2c9cf44398f9b65c81321de5eef573da382ca26a88e4f6ec WatchSource:0}: Error finding container 97b196055e5eedda2c9cf44398f9b65c81321de5eef573da382ca26a88e4f6ec: Status 404 returned error can't find the container with id 97b196055e5eedda2c9cf44398f9b65c81321de5eef573da382ca26a88e4f6ec Apr 16 22:33:25.681273 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.681169 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" path="/var/lib/kubelet/pods/6748ee0c-9307-49bd-8249-81eed9674e2a/volumes" Apr 16 22:33:25.809347 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.809311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerStarted","Data":"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"} Apr 16 22:33:25.809347 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:25.809352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerStarted","Data":"97b196055e5eedda2c9cf44398f9b65c81321de5eef573da382ca26a88e4f6ec"} Apr 16 22:33:28.477566 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:28.477518 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 16 22:33:28.481866 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:28.481831 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 22:33:28.820903 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:28.820870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/0.log" Apr 16 22:33:28.821060 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:28.820914 2571 generic.go:358] "Generic (PLEG): container finished" podID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerID="90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d" exitCode=1 Apr 16 22:33:28.821060 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:28.821006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" 
event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerDied","Data":"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"} Apr 16 22:33:29.051013 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.050950 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:33:29.132916 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.132872 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location\") pod \"c142c901-c0c6-429b-8b01-e6776240ae4a\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " Apr 16 22:33:29.133106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.132947 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config\") pod \"c142c901-c0c6-429b-8b01-e6776240ae4a\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " Apr 16 22:33:29.133106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.132985 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g94r\" (UniqueName: \"kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r\") pod \"c142c901-c0c6-429b-8b01-e6776240ae4a\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " Apr 16 22:33:29.133106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.133054 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") pod \"c142c901-c0c6-429b-8b01-e6776240ae4a\" (UID: \"c142c901-c0c6-429b-8b01-e6776240ae4a\") " Apr 16 22:33:29.133280 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.133249 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c142c901-c0c6-429b-8b01-e6776240ae4a" (UID: "c142c901-c0c6-429b-8b01-e6776240ae4a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:29.133338 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.133301 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-0da70d-kube-rbac-proxy-sar-config") pod "c142c901-c0c6-429b-8b01-e6776240ae4a" (UID: "c142c901-c0c6-429b-8b01-e6776240ae4a"). InnerVolumeSpecName "isvc-primary-0da70d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:33:29.133417 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.133390 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c142c901-c0c6-429b-8b01-e6776240ae4a-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:29.133417 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.133416 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-0da70d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c142c901-c0c6-429b-8b01-e6776240ae4a-isvc-primary-0da70d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:29.135211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.135178 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r" (OuterVolumeSpecName: "kube-api-access-2g94r") pod "c142c901-c0c6-429b-8b01-e6776240ae4a" (UID: "c142c901-c0c6-429b-8b01-e6776240ae4a"). InnerVolumeSpecName "kube-api-access-2g94r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:33:29.135329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.135247 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c142c901-c0c6-429b-8b01-e6776240ae4a" (UID: "c142c901-c0c6-429b-8b01-e6776240ae4a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:33:29.234655 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.234613 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2g94r\" (UniqueName: \"kubernetes.io/projected/c142c901-c0c6-429b-8b01-e6776240ae4a-kube-api-access-2g94r\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:29.234655 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.234647 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c142c901-c0c6-429b-8b01-e6776240ae4a-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 16 22:33:29.360572 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.360459 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"] Apr 16 22:33:29.491600 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491545 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"] Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491934 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491945 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491953 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="storage-initializer" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491960 2571 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="storage-initializer" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491981 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491988 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.491996 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kube-rbac-proxy" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492002 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kube-rbac-proxy" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492013 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.492062 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492018 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.492576 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492070 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kserve-container" Apr 16 22:33:29.492576 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492080 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerName="kube-rbac-proxy" Apr 16 22:33:29.492576 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492087 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.492576 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.492094 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="6748ee0c-9307-49bd-8249-81eed9674e2a" containerName="storage-initializer" Apr 16 22:33:29.496909 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.496884 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.499404 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.499381 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-88982-predictor-serving-cert\"" Apr 16 22:33:29.499526 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.499428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-88982-kube-rbac-proxy-sar-config\"" Apr 16 22:33:29.504409 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.504384 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"] Apr 16 22:33:29.537819 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.537782 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-88982-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.537993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.537829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.537993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.537893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxmg\" (UniqueName: \"kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.537993 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.537966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639041 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.638933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-88982-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639041 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.638976 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location\") pod 
\"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639041 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.639016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxmg\" (UniqueName: \"kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639353 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.639059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639353 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:29.639192 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-88982-predictor-serving-cert: secret "raw-sklearn-88982-predictor-serving-cert" not found Apr 16 22:33:29.639353 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:29.639261 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls podName:59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:33:30.139238348 +0000 UTC m=+1193.070563188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls") pod "raw-sklearn-88982-predictor-5576d8bccc-6bw9z" (UID: "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3") : secret "raw-sklearn-88982-predictor-serving-cert" not found Apr 16 22:33:29.639522 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.639495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.639780 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.639761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-88982-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.649989 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.649961 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxmg\" (UniqueName: \"kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:29.826040 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.826008 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/0.log" Apr 16 22:33:29.826227 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.826110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerStarted","Data":"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"} Apr 16 22:33:29.826295 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.826248 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer" containerID="cri-o://277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75" gracePeriod=30 Apr 16 22:33:29.828115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.828084 2571 generic.go:358] "Generic (PLEG): container finished" podID="c142c901-c0c6-429b-8b01-e6776240ae4a" containerID="7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb" exitCode=0 Apr 16 22:33:29.828247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.828166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerDied","Data":"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb"} Apr 16 22:33:29.828247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.828191 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" Apr 16 22:33:29.828247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.828202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s" event={"ID":"c142c901-c0c6-429b-8b01-e6776240ae4a","Type":"ContainerDied","Data":"32a352dd1c4dcb952b2b494497444d97720fe4201747993a004eb4ee6b820997"} Apr 16 22:33:29.828247 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.828223 2571 scope.go:117] "RemoveContainer" containerID="d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0" Apr 16 22:33:29.837367 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.837343 2571 scope.go:117] "RemoveContainer" containerID="7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb" Apr 16 22:33:29.846143 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.846120 2571 scope.go:117] "RemoveContainer" containerID="027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd" Apr 16 22:33:29.854436 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.854411 2571 scope.go:117] "RemoveContainer" containerID="d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0" Apr 16 22:33:29.854881 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:29.854795 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0\": container with ID starting with d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0 not found: ID does not exist" containerID="d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0" Apr 16 22:33:29.854881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.854834 2571 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0"} err="failed to get container status \"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0\": rpc error: code = NotFound desc = could not find container \"d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0\": container with ID starting with d9c8e8eade7d15dce195ed8864c630166c6f5c95f222140c6c1e9ad91b6e68d0 not found: ID does not exist" Apr 16 22:33:29.854881 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.854861 2571 scope.go:117] "RemoveContainer" containerID="7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb" Apr 16 22:33:29.855263 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:29.855149 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb\": container with ID starting with 7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb not found: ID does not exist" containerID="7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb" Apr 16 22:33:29.855263 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.855183 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb"} err="failed to get container status \"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb\": rpc error: code = NotFound desc = could not find container \"7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb\": container with ID starting with 7b2bc1e178bde512ee63f6eadeb98b8610ff6fdad59399775605610351663ecb not found: ID does not exist" Apr 16 22:33:29.855263 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.855205 2571 scope.go:117] "RemoveContainer" containerID="027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd" Apr 16 22:33:29.855485 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:29.855464 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd\": container with ID starting with 027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd not found: ID does not exist" containerID="027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd" Apr 16 22:33:29.855532 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.855492 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd"} err="failed to get container status \"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd\": rpc error: code = NotFound desc = could not find container \"027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd\": container with ID starting with 027c6e80b6a27609b2355536eb40bae8e7d297de9aba4812810103ee057695fd not found: ID does not exist" Apr 16 22:33:29.857366 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.857344 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:33:29.861128 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:29.861105 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0da70d-predictor-865f5b4c4d-rfd2s"] Apr 16 22:33:30.142833 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:33:30.142782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:30.145333 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:30.145308 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") pod \"raw-sklearn-88982-predictor-5576d8bccc-6bw9z\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") " pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:30.411611 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:30.411489 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" Apr 16 22:33:30.538367 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:30.538332 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"] Apr 16 22:33:30.540659 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:33:30.540620 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e9a1e4_970f_4f93_96b0_8b9e10c3a5e3.slice/crio-b44de7c82020217086dc1fc54c4e22868268488f1a3a9ff4ffbf26eb49f3f725 WatchSource:0}: Error finding container b44de7c82020217086dc1fc54c4e22868268488f1a3a9ff4ffbf26eb49f3f725: Status 404 returned error can't find the container with id b44de7c82020217086dc1fc54c4e22868268488f1a3a9ff4ffbf26eb49f3f725 Apr 16 22:33:30.835330 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:30.835289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerStarted","Data":"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"} Apr 16 22:33:30.835330 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:30.835334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerStarted","Data":"b44de7c82020217086dc1fc54c4e22868268488f1a3a9ff4ffbf26eb49f3f725"} Apr 16 22:33:31.680684 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:31.680639 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c142c901-c0c6-429b-8b01-e6776240ae4a" path="/var/lib/kubelet/pods/c142c901-c0c6-429b-8b01-e6776240ae4a/volumes" Apr 16 22:33:34.080542 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.080517 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/1.log" Apr 16 22:33:34.080961 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.080888 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/0.log" Apr 16 22:33:34.080961 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.080954 2571 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 22:33:34.178693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178593 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8dd5\" (UniqueName: \"kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5\") pod \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") "
Apr 16 22:33:34.178693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178652 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\") pod \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") "
Apr 16 22:33:34.178693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178681 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location\") pod \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") "
Apr 16 22:33:34.178981 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178721 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") pod \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") "
Apr 16 22:33:34.178981 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178755 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert\") pod \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\" (UID: \"949ab304-d3c9-418c-a99f-e2fd1302a0e5\") "
Apr 16 22:33:34.179088 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.178993 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "949ab304-d3c9-418c-a99f-e2fd1302a0e5" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:33:34.179137 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.179091 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config") pod "949ab304-d3c9-418c-a99f-e2fd1302a0e5" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5"). InnerVolumeSpecName "isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:33:34.179186 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.179151 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "949ab304-d3c9-418c-a99f-e2fd1302a0e5" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:33:34.180944 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.180919 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "949ab304-d3c9-418c-a99f-e2fd1302a0e5" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:33:34.181049 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.180954 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5" (OuterVolumeSpecName: "kube-api-access-j8dd5") pod "949ab304-d3c9-418c-a99f-e2fd1302a0e5" (UID: "949ab304-d3c9-418c-a99f-e2fd1302a0e5"). InnerVolumeSpecName "kube-api-access-j8dd5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:33:34.279948 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.279914 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/949ab304-d3c9-418c-a99f-e2fd1302a0e5-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:33:34.279948 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.279945 2571 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-cabundle-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:33:34.279948 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.279955 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8dd5\" (UniqueName: \"kubernetes.io/projected/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kube-api-access-j8dd5\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:33:34.280187 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.279966 2571 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/949ab304-d3c9-418c-a99f-e2fd1302a0e5-isvc-init-fail-a0bd10-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:33:34.280187 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.279976 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/949ab304-d3c9-418c-a99f-e2fd1302a0e5-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:33:34.850633 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.850584 2571 generic.go:358] "Generic (PLEG): container finished" podID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerID="456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c" exitCode=0
Apr 16 22:33:34.850878 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.850659 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerDied","Data":"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"}
Apr 16 22:33:34.851918 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.851898 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/1.log"
Apr 16 22:33:34.852264 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852250 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4_949ab304-d3c9-418c-a99f-e2fd1302a0e5/storage-initializer/0.log"
Apr 16 22:33:34.852337 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852283 2571 generic.go:358] "Generic (PLEG): container finished" podID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerID="277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75" exitCode=1
Apr 16 22:33:34.852385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852340 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerDied","Data":"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"}
Apr 16 22:33:34.852385 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852368 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"
Apr 16 22:33:34.852457 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852384 2571 scope.go:117] "RemoveContainer" containerID="277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"
Apr 16 22:33:34.852532 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.852371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4" event={"ID":"949ab304-d3c9-418c-a99f-e2fd1302a0e5","Type":"ContainerDied","Data":"97b196055e5eedda2c9cf44398f9b65c81321de5eef573da382ca26a88e4f6ec"}
Apr 16 22:33:34.865137 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.865115 2571 scope.go:117] "RemoveContainer" containerID="90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"
Apr 16 22:33:34.881686 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.881656 2571 scope.go:117] "RemoveContainer" containerID="277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"
Apr 16 22:33:34.882009 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:34.881990 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75\": container with ID starting with 277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75 not found: ID does not exist" containerID="277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"
Apr 16 22:33:34.882061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.882019 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75"} err="failed to get container status \"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75\": rpc error: code = NotFound desc = could not find container \"277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75\": container with ID starting with 277011d22af859257a9d79dda93d4dd2f0515fdcec420f57066e560d470e4e75 not found: ID does not exist"
Apr 16 22:33:34.882061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.882041 2571 scope.go:117] "RemoveContainer" containerID="90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"
Apr 16 22:33:34.882284 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:33:34.882265 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d\": container with ID starting with 90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d not found: ID does not exist" containerID="90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"
Apr 16 22:33:34.882329 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.882288 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d"} err="failed to get container status \"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d\": rpc error: code = NotFound desc = could not find container \"90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d\": container with ID starting with 90702a041cb77efd802f3f31618b344b478e434fd45f32e969a54474bd4a165d not found: ID does not exist"
Apr 16 22:33:34.896248 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.896211 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"]
Apr 16 22:33:34.898685 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:34.898386 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a0bd10-predictor-56b7767fbc-hpgz4"]
Apr 16 22:33:35.680336 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.680298 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" path="/var/lib/kubelet/pods/949ab304-d3c9-418c-a99f-e2fd1302a0e5/volumes"
Apr 16 22:33:35.858856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.858820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerStarted","Data":"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"}
Apr 16 22:33:35.858856 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.858858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerStarted","Data":"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"}
Apr 16 22:33:35.859187 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.859167 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:33:35.859339 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.859305 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:33:35.860578 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.860536 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:33:35.876767 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:35.876723 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podStartSLOduration=6.876706372 podStartE2EDuration="6.876706372s" podCreationTimestamp="2026-04-16 22:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:33:35.875589169 +0000 UTC m=+1198.806914033" watchObservedRunningTime="2026-04-16 22:33:35.876706372 +0000 UTC m=+1198.808031238"
Apr 16 22:33:36.862742 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:36.862704 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:33:37.625211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:37.625182 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log"
Apr 16 22:33:37.627690 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:37.627664 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log"
Apr 16 22:33:41.867819 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:41.867788 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:33:41.868438 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:41.868405 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:33:51.869268 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:33:51.869220 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:01.868361 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:01.868313 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:11.868344 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:11.868300 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:21.868349 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:21.868306 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:31.868895 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:31.868790 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:41.869185 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:41.869154 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:34:49.600502 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.600469 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"]
Apr 16 22:34:49.601086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.600890 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" containerID="cri-o://4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754" gracePeriod=30
Apr 16 22:34:49.601086 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.600909 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kube-rbac-proxy" containerID="cri-o://cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9" gracePeriod=30
Apr 16 22:34:49.680368 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680336 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:34:49.680718 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680705 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.680788 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680720 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.680788 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680729 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.680788 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680735 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.680947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680803 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.680947 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.680918 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="949ab304-d3c9-418c-a99f-e2fd1302a0e5" containerName="storage-initializer"
Apr 16 22:34:49.684314 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.684293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.686872 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.686846 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\""
Apr 16 22:34:49.686974 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.686874 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-8cb92-predictor-serving-cert\""
Apr 16 22:34:49.693150 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.693122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:34:49.834066 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.834022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvw7x\" (UniqueName: \"kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.834270 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.834108 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.834270 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.834253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.834525 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.834500 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.935275 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.935171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvw7x\" (UniqueName: \"kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.935275 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.935264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.935518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.935312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.935518 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.935354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.935792 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.935770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.936115 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.936086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.938179 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.938149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.943003 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.942985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvw7x\" (UniqueName: \"kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x\") pod \"raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:49.995779 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:49.995751 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:50.121211 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:50.121183 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:34:50.121351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:50.121330 2571 generic.go:358] "Generic (PLEG): container finished" podID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerID="cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9" exitCode=2
Apr 16 22:34:50.121403 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:50.121392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerDied","Data":"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"}
Apr 16 22:34:50.123228 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:34:50.123204 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a57ec34_42f2_4076_b276_b514190f0400.slice/crio-904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae WatchSource:0}: Error finding container 904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae: Status 404 returned error can't find the container with id 904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae
Apr 16 22:34:50.124814 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:50.124799 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:34:51.126292 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:51.126260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerStarted","Data":"7522cafc3670319e855126125eba7712746821755df0a4a668ff40ff95965428"}
Apr 16 22:34:51.126692 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:51.126299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerStarted","Data":"904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae"}
Apr 16 22:34:51.863613 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:51.863573 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused"
Apr 16 22:34:51.868966 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:51.868937 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 22:34:54.349157 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.349131 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:34:54.476274 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476186 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnxmg\" (UniqueName: \"kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg\") pod \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") "
Apr 16 22:34:54.476274 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476231 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location\") pod \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") "
Apr 16 22:34:54.476523 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476311 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-88982-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config\") pod \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") "
Apr 16 22:34:54.476523 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476342 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") pod \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\" (UID: \"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3\") "
Apr 16 22:34:54.476684 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476658 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" (UID: "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:34:54.476733 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.476707 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-88982-kube-rbac-proxy-sar-config") pod "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" (UID: "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3"). InnerVolumeSpecName "raw-sklearn-88982-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:34:54.478378 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.478358 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" (UID: "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:34:54.478449 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.478381 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg" (OuterVolumeSpecName: "kube-api-access-pnxmg") pod "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" (UID: "59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3"). InnerVolumeSpecName "kube-api-access-pnxmg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:34:54.577065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.577026 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnxmg\" (UniqueName: \"kubernetes.io/projected/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kube-api-access-pnxmg\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:34:54.577065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.577060 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:34:54.577065 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.577073 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-88982-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-raw-sklearn-88982-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:34:54.577305 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:54.577085 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:34:55.142519 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.142485 2571 generic.go:358] "Generic (PLEG): container finished" podID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerID="4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754" exitCode=0
Apr 16 22:34:55.142820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.142589 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"
Apr 16 22:34:55.142820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.142588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerDied","Data":"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"}
Apr 16 22:34:55.142820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.142629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z" event={"ID":"59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3","Type":"ContainerDied","Data":"b44de7c82020217086dc1fc54c4e22868268488f1a3a9ff4ffbf26eb49f3f725"}
Apr 16 22:34:55.142820 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.142649 2571 scope.go:117] "RemoveContainer" containerID="cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"
Apr 16 22:34:55.144134 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.144110 2571 generic.go:358] "Generic (PLEG): container finished" podID="0a57ec34-42f2-4076-b276-b514190f0400" containerID="7522cafc3670319e855126125eba7712746821755df0a4a668ff40ff95965428" exitCode=0
Apr 16 22:34:55.144253 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.144184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerDied","Data":"7522cafc3670319e855126125eba7712746821755df0a4a668ff40ff95965428"}
Apr 16 22:34:55.151748 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.151722 2571 scope.go:117] "RemoveContainer" containerID="4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"
Apr 16 22:34:55.159210 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.159190 2571 scope.go:117] "RemoveContainer" containerID="456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"
Apr 16 22:34:55.178109 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.178083 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"]
Apr 16 22:34:55.179239 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.178627 2571 scope.go:117] "RemoveContainer" containerID="cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"
Apr 16 22:34:55.179239 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:34:55.179194 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9\": container with ID starting with cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9 not found: ID does not exist" containerID="cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"
Apr 16 22:34:55.179239 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.179226 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9"} err="failed to get container status \"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9\": rpc error: code = NotFound desc = could not find container \"cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9\": container with ID starting with cb763e541b64a9f21de6e4ad2c8b77322a47c82ec74242b792b2eade8b5effc9 not found: ID does not exist"
Apr 16 22:34:55.179461 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.179247 2571 scope.go:117] "RemoveContainer" containerID="4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"
Apr 16 22:34:55.179574 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:34:55.179533 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754\": container with ID starting with 4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754 not found: ID does not exist" containerID="4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"
Apr 16 22:34:55.179625 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.179585 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754"} err="failed to get container status \"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754\": rpc error: code = NotFound desc = could not find container \"4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754\": container with ID starting with 4f8a50fc4404a459d9a46785fa5d73a7ed37a013a168422c8e0681d9946b6754 not found: ID does not exist"
Apr 16 22:34:55.179625 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.179604 2571 scope.go:117] "RemoveContainer" containerID="456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"
Apr 16 22:34:55.179849 ip-10-0-129-102 kubenswrapper[2571]: E0416 22:34:55.179835 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c\": container with ID starting with 456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c not found: ID does not exist" containerID="456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"
Apr 16 22:34:55.180030 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.179851 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c"} err="failed to get container status \"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c\": rpc error: code = NotFound desc = could not find container \"456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c\": container with ID starting with 456bf2ad0d743b07d85bd408616cb13e3b1e5a94800ea7c2ecbe94bc030aa36c not found: ID does not exist"
Apr 16 22:34:55.180331 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.180313 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-88982-predictor-5576d8bccc-6bw9z"]
Apr 16 22:34:55.679220 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:55.679181 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" path="/var/lib/kubelet/pods/59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3/volumes"
Apr 16 22:34:56.149313 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:56.149273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerStarted","Data":"6d91aed6faf2f0fa3cc7a0fa5bf54858f8a66cf91604c09d94ee1790ef6d628b"}
Apr 16 22:34:56.149313 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:56.149316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerStarted","Data":"48c90c9264a1c776534bd2c079a2fe0c62c138a94b5e3cdeea4cd115b12fc6f1"}
Apr 16 22:34:56.149613 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:56.149595 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:56.169513 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:56.169456 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podStartSLOduration=7.169437477 podStartE2EDuration="7.169437477s" podCreationTimestamp="2026-04-16 22:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:56.166863549 +0000 UTC m=+1279.098188411" watchObservedRunningTime="2026-04-16 22:34:56.169437477 +0000 UTC m=+1279.100762343"
Apr 16 22:34:57.153471 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:57.153441 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:34:57.154810 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:57.154782 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:34:58.156708 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:34:58.156674 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:03.161585 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:03.161528 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:35:03.162176 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:03.162146 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:13.162245 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:13.162186 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:23.163101 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:23.163046 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:33.162433 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:33.162393 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:43.163013 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:43.162969 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:35:53.162506 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:35:53.162465 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:36:03.162704 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:03.162675 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:36:09.786312 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:09.786265 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:36:09.786819 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:09.786733 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" containerID="cri-o://48c90c9264a1c776534bd2c079a2fe0c62c138a94b5e3cdeea4cd115b12fc6f1" gracePeriod=30
Apr 16 22:36:09.786894 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:09.786785 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kube-rbac-proxy" containerID="cri-o://6d91aed6faf2f0fa3cc7a0fa5bf54858f8a66cf91604c09d94ee1790ef6d628b" gracePeriod=30
Apr 16 22:36:10.410352 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:10.410319 2571 generic.go:358] "Generic (PLEG): container finished" podID="0a57ec34-42f2-4076-b276-b514190f0400" containerID="6d91aed6faf2f0fa3cc7a0fa5bf54858f8a66cf91604c09d94ee1790ef6d628b" exitCode=2
Apr 16 22:36:10.410528 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:10.410393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerDied","Data":"6d91aed6faf2f0fa3cc7a0fa5bf54858f8a66cf91604c09d94ee1790ef6d628b"}
Apr 16 22:36:13.157164 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:13.157121 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 16 22:36:13.162745 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:13.162712 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 22:36:14.426868 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.426838 2571 generic.go:358] "Generic (PLEG): container finished" podID="0a57ec34-42f2-4076-b276-b514190f0400" containerID="48c90c9264a1c776534bd2c079a2fe0c62c138a94b5e3cdeea4cd115b12fc6f1" exitCode=0
Apr 16 22:36:14.427221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.426910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerDied","Data":"48c90c9264a1c776534bd2c079a2fe0c62c138a94b5e3cdeea4cd115b12fc6f1"}
Apr 16 22:36:14.427221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.426956 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk" event={"ID":"0a57ec34-42f2-4076-b276-b514190f0400","Type":"ContainerDied","Data":"904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae"}
Apr 16 22:36:14.427221 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.426970 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904cdee7bff197f8da1f3d39973ef1e721deee5462652fd5c8f60d387803b1ae"
Apr 16 22:36:14.436867 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.436845 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:36:14.499690 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.499650 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\") pod \"0a57ec34-42f2-4076-b276-b514190f0400\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") "
Apr 16 22:36:14.499690 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.499689 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvw7x\" (UniqueName: \"kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x\") pod \"0a57ec34-42f2-4076-b276-b514190f0400\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") "
Apr 16 22:36:14.499959 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.499716 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location\") pod \"0a57ec34-42f2-4076-b276-b514190f0400\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") "
Apr 16 22:36:14.499959 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.499834 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls\") pod \"0a57ec34-42f2-4076-b276-b514190f0400\" (UID: \"0a57ec34-42f2-4076-b276-b514190f0400\") "
Apr 16 22:36:14.500101 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.500067 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a57ec34-42f2-4076-b276-b514190f0400" (UID: "0a57ec34-42f2-4076-b276-b514190f0400"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:36:14.500156 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.500105 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config") pod "0a57ec34-42f2-4076-b276-b514190f0400" (UID: "0a57ec34-42f2-4076-b276-b514190f0400"). InnerVolumeSpecName "raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:36:14.501859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.501828 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a57ec34-42f2-4076-b276-b514190f0400" (UID: "0a57ec34-42f2-4076-b276-b514190f0400"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:36:14.501859 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.501850 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x" (OuterVolumeSpecName: "kube-api-access-dvw7x") pod "0a57ec34-42f2-4076-b276-b514190f0400" (UID: "0a57ec34-42f2-4076-b276-b514190f0400"). InnerVolumeSpecName "kube-api-access-dvw7x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:36:14.600919 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.600863 2571 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a57ec34-42f2-4076-b276-b514190f0400-proxy-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:36:14.600919 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.600915 2571 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a57ec34-42f2-4076-b276-b514190f0400-raw-sklearn-runtime-8cb92-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:36:14.600919 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.600932 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvw7x\" (UniqueName: \"kubernetes.io/projected/0a57ec34-42f2-4076-b276-b514190f0400-kube-api-access-dvw7x\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:36:14.601178 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:14.600947 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a57ec34-42f2-4076-b276-b514190f0400-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 16 22:36:15.430763 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:15.430730 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"
Apr 16 22:36:15.451514 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:15.451476 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:36:15.456099 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:15.456068 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8cb92-predictor-7cf4f8cbfc-2jtpk"]
Apr 16 22:36:15.679347 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:15.679310 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a57ec34-42f2-4076-b276-b514190f0400" path="/var/lib/kubelet/pods/0a57ec34-42f2-4076-b276-b514190f0400/volumes"
Apr 16 22:36:38.126589 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:38.126536 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jng42_a88aba50-e875-413f-a0d8-0887fad52a8e/global-pull-secret-syncer/0.log"
Apr 16 22:36:38.193777 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:38.193741 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5bljr_eb3ceb69-c9ef-44b5-bfc3-0c42f2d8d502/konnectivity-agent/0.log"
Apr 16 22:36:38.246663 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:38.246629 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-102.ec2.internal_432a54920ff69b032f406403f8e82323/haproxy/0.log"
Apr 16 22:36:41.762668 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.762637 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/alertmanager/0.log"
Apr 16 22:36:41.785799 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.785775 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/config-reloader/0.log"
Apr 16 22:36:41.806257 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.806226 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/kube-rbac-proxy-web/0.log"
Apr 16 22:36:41.827838 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.827806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/kube-rbac-proxy/0.log"
Apr 16 22:36:41.852675 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.852652 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/kube-rbac-proxy-metric/0.log"
Apr 16 22:36:41.873042 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.873017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/prom-label-proxy/0.log"
Apr 16 22:36:41.892379 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.892353 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03b733a7-476c-4e64-86e3-41c50662d4d1/init-config-reloader/0.log"
Apr 16 22:36:41.924097 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:41.924067 2571 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kr46t_8fbcc451-603b-46ab-be67-fab197b0645c/cluster-monitoring-operator/0.log" Apr 16 22:36:42.019330 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.019225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fdbdc7cff-g759t_9e97edd7-009d-4e18-9786-f18ece2787a7/metrics-server/0.log" Apr 16 22:36:42.043716 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.043687 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-46xrd_48b33455-80b7-4a5e-9249-bdd9508d2074/monitoring-plugin/0.log" Apr 16 22:36:42.155195 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.155167 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwbcr_9e414258-8f48-4834-bd1d-b4fc85272d4f/node-exporter/0.log" Apr 16 22:36:42.174382 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.174358 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwbcr_9e414258-8f48-4834-bd1d-b4fc85272d4f/kube-rbac-proxy/0.log" Apr 16 22:36:42.198988 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.198969 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lwbcr_9e414258-8f48-4834-bd1d-b4fc85272d4f/init-textfile/0.log" Apr 16 22:36:42.290061 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.289982 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hrwj8_2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b/kube-rbac-proxy-main/0.log" Apr 16 22:36:42.311684 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.311651 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hrwj8_2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b/kube-rbac-proxy-self/0.log" Apr 16 22:36:42.331215 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.331181 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hrwj8_2faeab6a-a0ee-4f0e-a6f3-b39f1a09e72b/openshift-state-metrics/0.log" Apr 16 22:36:42.569268 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:42.569188 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-95ddb_6b034329-a956-4a82-8545-11a6b8cfddd3/prometheus-operator-admission-webhook/0.log" Apr 16 22:36:44.356038 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:44.356007 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/1.log" Apr 16 22:36:44.360189 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:44.360160 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ktkhc_c1491aea-f867-4bd4-ab58-776381aad953/console-operator/2.log" Apr 16 22:36:44.738441 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:44.738411 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-spq2q_240bb924-5548-44ac-aab5-ffc41dca2bf6/download-server/0.log" Apr 16 22:36:45.283051 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283017 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5"] Apr 16 22:36:45.283409 
ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283397 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="storage-initializer" Apr 16 22:36:45.283458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283411 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="storage-initializer" Apr 16 22:36:45.283458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283421 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kube-rbac-proxy" Apr 16 22:36:45.283458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283427 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kube-rbac-proxy" Apr 16 22:36:45.283458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283434 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kube-rbac-proxy" Apr 16 22:36:45.283458 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283440 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kube-rbac-proxy" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283461 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="storage-initializer" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283467 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="storage-initializer" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283474 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283479 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kserve-container" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283487 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283492 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283542 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kube-rbac-proxy" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283568 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="59e9a1e4-970f-4f93-96b0-8b9e10c3a5e3" containerName="kserve-container" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283577 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a57ec34-42f2-4076-b276-b514190f0400" containerName="kube-rbac-proxy" Apr 16 22:36:45.283635 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.283586 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a57ec34-42f2-4076-b276-b514190f0400" 
containerName="kserve-container" Apr 16 22:36:45.286650 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.286627 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.289018 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.288996 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"kube-root-ca.crt\"" Apr 16 22:36:45.289971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.289945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"openshift-service-ca.crt\"" Apr 16 22:36:45.289971 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.289965 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wls74\"/\"default-dockercfg-zkhg7\"" Apr 16 22:36:45.294351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.294326 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5"] Apr 16 22:36:45.369106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.369070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-proc\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.369106 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.369112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-podres\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.369533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.369182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndgd\" (UniqueName: \"kubernetes.io/projected/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-kube-api-access-zndgd\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.369533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.369212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-sys\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.369533 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.369232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-lib-modules\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470093 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zndgd\" (UniqueName: 
\"kubernetes.io/projected/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-kube-api-access-zndgd\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-sys\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470145 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-lib-modules\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-proc\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-podres\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470286 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-sys\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470488 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470286 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-proc\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470488 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-lib-modules\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.470488 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.470363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-podres\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " 
pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.477952 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.477933 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zndgd\" (UniqueName: \"kubernetes.io/projected/0eaaa144-e7a8-4613-b9e5-87fd6f22c453-kube-api-access-zndgd\") pod \"perf-node-gather-daemonset-jwhg5\" (UID: \"0eaaa144-e7a8-4613-b9e5-87fd6f22c453\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.598337 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.598229 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:45.724998 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.724962 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5"] Apr 16 22:36:45.728042 ip-10-0-129-102 kubenswrapper[2571]: W0416 22:36:45.728005 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0eaaa144_e7a8_4613_b9e5_87fd6f22c453.slice/crio-246048a2d25002deb6e5fc0337b2eb2a7d65f044dc1657a88da478a655592642 WatchSource:0}: Error finding container 246048a2d25002deb6e5fc0337b2eb2a7d65f044dc1657a88da478a655592642: Status 404 returned error can't find the container with id 246048a2d25002deb6e5fc0337b2eb2a7d65f044dc1657a88da478a655592642 Apr 16 22:36:45.815388 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.815353 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcg8z_011ad47b-a64a-4697-8f37-02cbc931d548/dns/0.log" Apr 16 22:36:45.834693 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.834670 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcg8z_011ad47b-a64a-4697-8f37-02cbc931d548/kube-rbac-proxy/0.log" Apr 16 22:36:45.927378 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:45.927299 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bz2qc_7eb428a4-211d-4444-991e-d8b3dac28ddf/dns-node-resolver/0.log" Apr 16 22:36:46.341225 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:46.341194 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4mcb4_5c62daff-6789-4383-b4d0-6b51a07c06bb/node-ca/0.log" Apr 16 22:36:46.539450 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:46.539405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" event={"ID":"0eaaa144-e7a8-4613-b9e5-87fd6f22c453","Type":"ContainerStarted","Data":"dba00d33b5162424cdb40e0006a231cdc125373f37a437d231bfc5a250e72bd7"} Apr 16 22:36:46.539450 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:46.539448 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" event={"ID":"0eaaa144-e7a8-4613-b9e5-87fd6f22c453","Type":"ContainerStarted","Data":"246048a2d25002deb6e5fc0337b2eb2a7d65f044dc1657a88da478a655592642"} Apr 16 22:36:46.539890 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:46.539520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:46.555307 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:46.555261 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" podStartSLOduration=1.555245169 podStartE2EDuration="1.555245169s" podCreationTimestamp="2026-04-16 22:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:46.55359664 +0000 UTC m=+1389.484921502" watchObservedRunningTime="2026-04-16 22:36:46.555245169 +0000 UTC m=+1389.486570030" Apr 16 22:36:47.429203 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:47.429171 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gv9kf_9b954b07-4736-4bf5-a073-457f98c06525/serve-healthcheck-canary/0.log" Apr 16 22:36:47.854234 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:47.854209 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmj7s_f55d1b2b-7070-4ea9-a110-9789e41cd235/kube-rbac-proxy/0.log" Apr 16 22:36:47.874036 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:47.874007 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmj7s_f55d1b2b-7070-4ea9-a110-9789e41cd235/exporter/0.log" Apr 16 22:36:47.894773 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:47.894739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmj7s_f55d1b2b-7070-4ea9-a110-9789e41cd235/extractor/0.log" Apr 16 22:36:49.902042 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:49.902014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84d7d5cfc6-snfdj_4cbc9c3f-0ce4-4d17-959b-3b8d2b50fb9f/manager/0.log" Apr 16 22:36:49.943406 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:49.943375 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-89t7g_edcb5ce6-dde1-4d26-bd47-0da4c3382dca/server/0.log" Apr 16 22:36:50.023230 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:50.023193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-4k48d_7b3ebba6-88e3-47e5-bbea-2c83998dd7c7/manager/0.log" Apr 16 22:36:50.068626 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:50.068596 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-h4xp9_08407eaf-053b-482d-958c-ee5b0a4357bd/seaweedfs/0.log" Apr 16 22:36:52.553707 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:52.553679 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-jwhg5" Apr 16 22:36:53.584140 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:53.584112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7g4vw_adbbb6b8-3903-47df-997c-6351eef0c24d/migrator/0.log" Apr 16 22:36:53.603389 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:53.603364 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7g4vw_adbbb6b8-3903-47df-997c-6351eef0c24d/graceful-termination/0.log" Apr 16 22:36:55.357597 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.357493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/kube-multus-additional-cni-plugins/0.log" Apr 16 22:36:55.380707 ip-10-0-129-102 kubenswrapper[2571]: I0416 
22:36:55.380676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/egress-router-binary-copy/0.log" Apr 16 22:36:55.401713 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.401679 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/cni-plugins/0.log" Apr 16 22:36:55.421285 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.421258 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/bond-cni-plugin/0.log" Apr 16 22:36:55.442671 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.442649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/routeoverride-cni/0.log" Apr 16 22:36:55.462889 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.462862 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/whereabouts-cni-bincopy/0.log" Apr 16 22:36:55.482257 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.482238 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fd5n6_efd08e10-5da2-4f18-afe6-c78ed9bde562/whereabouts-cni/0.log" Apr 16 22:36:55.512898 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.512875 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-r6vhd_7de19038-d2d6-4b61-acee-01b1d7fed4e2/kube-multus/0.log" Apr 16 22:36:55.563088 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.563044 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2f4gk_e24f7f3c-00b2-43d5-9a49-1b7ee75125a1/network-metrics-daemon/0.log" Apr 16 22:36:55.582249 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:55.582220 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2f4gk_e24f7f3c-00b2-43d5-9a49-1b7ee75125a1/kube-rbac-proxy/0.log" Apr 16 22:36:56.667863 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.667835 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/ovn-controller/0.log" Apr 16 22:36:56.691241 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.691214 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/ovn-acl-logging/0.log" Apr 16 22:36:56.711503 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.711478 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/kube-rbac-proxy-node/0.log" Apr 16 22:36:56.734970 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.734889 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 22:36:56.751824 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.751791 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/northd/0.log" Apr 16 22:36:56.770776 ip-10-0-129-102 
kubenswrapper[2571]: I0416 22:36:56.770744 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/nbdb/0.log" Apr 16 22:36:56.790022 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.789994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/sbdb/0.log" Apr 16 22:36:56.883363 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:56.883330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8trxs_401b401e-f58b-4d1a-ac91-0376c9ee48ff/ovnkube-controller/0.log" Apr 16 22:36:58.126703 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:58.126670 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-7hlrt_a3be75ea-4ff5-4062-8e69-d6fdf589b369/check-endpoints/0.log" Apr 16 22:36:58.171641 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:58.171602 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kh55g_c06150de-115f-4d2c-8b2c-ec356592e26f/network-check-target-container/0.log" Apr 16 22:36:59.070490 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:59.070459 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6rfdq_7547aebf-2698-4a15-952f-2dc060a10282/iptables-alerter/0.log" Apr 16 22:36:59.686351 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:36:59.686321 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hfs2p_3bd79e7a-3b44-45b8-aefb-daaeaf2abb75/tuned/0.log" Apr 16 22:37:01.272328 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:37:01.272240 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-6jtg5_29cdf665-3a36-4a09-a77b-299bff99a6ac/cluster-samples-operator/0.log" Apr 16 22:37:01.287536 ip-10-0-129-102 kubenswrapper[2571]: I0416 22:37:01.287508 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-6jtg5_29cdf665-3a36-4a09-a77b-299bff99a6ac/cluster-samples-operator-watch/0.log"