Apr 16 18:30:37.938042 ip-10-0-140-1 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:38.372812 ip-10-0-140-1 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:38.372812 ip-10-0-140-1 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:38.372812 ip-10-0-140-1 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:38.372812 ip-10-0-140-1 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:38.372812 ip-10-0-140-1 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:38.375571 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.375479 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383025 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383043 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383048 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383052 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383056 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:38.383051 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383060 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383065 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383069 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383072 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383076 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383083 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383088 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383092 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383096 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383101 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383105 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383109 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383112 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383116 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383120 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383123 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383129 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383136 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:38.383376 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383141 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383145 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383150 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383154 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383158 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383162 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383165 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383169 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383173 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383177 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383181 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383185 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383190 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383194 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383198 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383202 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383207 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383212 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383216 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383220 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:38.384117 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383225 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383229 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383235 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383240 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383244 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383248 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383252 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383256 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383260 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383265 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383269 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383273 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383277 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383281 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383286 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383290 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383294 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383298 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383303 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383308 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:38.384855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383312 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383316 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383320 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383325 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383329 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383333 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383337 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383341 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383345 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383350 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383354 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383358 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383363 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383367 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383371 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383376 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383382 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383386 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383390 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383394 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:38.385449 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383399 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383403 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.383407 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384082 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384091 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384096 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384102 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384109 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384114 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384118 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384123 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384127 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384132 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384136 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384139 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384144 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384148 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384153 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384157 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:30:38.386316 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384161 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384166 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384170 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384174 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384178 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384183 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384187 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384191 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384196 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384201 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384206 2577 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384210 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384213 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384218 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384222 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384226 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384230 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384235 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384240 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384244 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:30:38.386944 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384248 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384253 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384257 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384262 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384267 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384273 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384277 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384282 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384287 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384291 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384295 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384299 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384303 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:38.387584 
ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384307 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384312 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384315 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384320 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384324 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384328 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384332 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:38.387584 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384361 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384369 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384374 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384379 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384383 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384388 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384392 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384396 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384400 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384405 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384409 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384413 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384417 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384421 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384425 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384432 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 
18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384438 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384450 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384454 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384459 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:30:38.388221 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384464 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384468 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384473 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384477 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384481 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384486 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384490 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384494 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384498 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.384503 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385188 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385203 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385212 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385219 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385229 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385235 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385249 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385256 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385262 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 18:30:38.388855 
ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385267 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385273 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 18:30:38.388855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385279 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385284 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385289 2577 flags.go:64] FLAG: --cgroup-root="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385294 2577 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385299 2577 flags.go:64] FLAG: --client-ca-file="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385304 2577 flags.go:64] FLAG: --cloud-config="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385309 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385315 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385325 2577 flags.go:64] FLAG: --cluster-domain="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385330 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385335 2577 flags.go:64] FLAG: --config-dir="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385340 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385345 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385351 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385357 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385362 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385367 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385372 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385377 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385382 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385387 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385392 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385399 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385404 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 18:30:38.389362 
ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385408 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 18:30:38.389362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385413 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385419 2577 flags.go:64] FLAG: --enable-server="true" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385424 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385431 2577 flags.go:64] FLAG: --event-burst="100" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385436 2577 flags.go:64] FLAG: --event-qps="50" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385441 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385446 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385451 2577 flags.go:64] FLAG: --eviction-hard="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385457 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385461 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385466 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385471 2577 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385476 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385481 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385487 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385493 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385498 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385502 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385507 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385513 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385518 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385523 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385529 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385534 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385539 2577 flags.go:64] FLAG: --help="false" Apr 16 18:30:38.390115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385544 2577 flags.go:64] 
FLAG: --hostname-override="ip-10-0-140-1.ec2.internal" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385549 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385554 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385559 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385565 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385571 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385576 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385581 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385586 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385591 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385596 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385602 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385607 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385612 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385616 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385621 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385626 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385631 2577 flags.go:64] FLAG: --lock-file="" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385635 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385640 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385645 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385654 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385661 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385666 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:30:38.390731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385670 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385675 2577 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385681 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385685 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385690 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385697 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385702 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385708 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385713 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385718 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385723 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385735 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385741 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385751 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385756 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385784 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385789 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385795 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385801 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385806 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385814 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385819 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385824 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385829 2577 flags.go:64] FLAG: --port="10250" Apr 16 18:30:38.391356 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385834 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385839 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-037dbf3e5c173f7d0" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385844 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 
18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385849 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385854 2577 flags.go:64] FLAG: --register-node="true" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385859 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385863 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385872 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385878 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385882 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385887 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385893 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385897 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385902 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385907 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385912 2577 flags.go:64] FLAG: --runonce="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385917 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385922 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385927 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385931 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385936 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385942 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385948 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385953 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385957 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385962 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:30:38.391985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385967 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385973 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385978 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385982 
2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385987 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.385996 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386001 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386006 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386013 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386017 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386022 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386027 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386032 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386038 2577 flags.go:64] FLAG: --v="2" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386045 2577 flags.go:64] FLAG: --version="false" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386052 2577 flags.go:64] FLAG: --vmodule="" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386058 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.386064 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386212 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386219 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386223 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386228 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386232 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386237 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:38.392654 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386241 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386246 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386250 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386256 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:38.393261 ip-10-0-140-1 
kubenswrapper[2577]: W0416 18:30:38.386260 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386264 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386269 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386273 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386278 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386283 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386288 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386293 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386297 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386301 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386306 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386310 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386314 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386318 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386322 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386326 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:38.393261 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386330 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386335 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386339 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386343 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386347 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386352 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386356 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386361 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386365 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386369 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386375 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386381 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386385 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386389 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386394 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386400 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386404 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386408 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386413 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386417 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:38.393780 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386421 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386425 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386429 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386435 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386440 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386445 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386450 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386454 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386459 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386463 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386467 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386472 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386476 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386481 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386485 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386489 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386493 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386497 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386501 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:38.394277 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386506 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386510 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386515 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386519 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386523 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386530 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
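The unrecognized-gate warnings above come from feature_gate.go:328: the kubelet's Kubernetes-level gate parser does not know OpenShift's cluster-scoped gates, so it warns about each one and skips it rather than failing startup. A minimal Go sketch of that warn-and-ignore behaviour, with illustrative gate names rather than the kubelet's real table:

    // warn_unknown_gates.go — a sketch of the warn-and-ignore parse seen at
    // feature_gate.go:328 above; gate names here are illustrative stand-ins.
    package main

    import "fmt"

    func main() {
        // Tiny stand-in for the set of gates the Kubernetes parser recognizes.
        known := map[string]bool{"NodeSwap": true, "ImageVolume": true, "KMSv1": true}

        // Mixed input as a node config might supply it; MachineConfigNodes stands in
        // for the OpenShift-scoped gates the kubelet does not know.
        requested := map[string]bool{"ImageVolume": true, "MachineConfigNodes": true}

        effective := map[string]bool{}
        for name, enabled := range requested {
            if !known[name] {
                fmt.Printf("W unrecognized feature gate: %s\n", name) // warn, keep going
                continue
            }
            effective[name] = enabled
        }
        fmt.Println("effective gates:", effective)
    }

Because unknown names are skipped rather than rejected, the same storm of warnings repeats each time the gate set is re-parsed, which is why the block recurs below.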
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386536 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386541 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386548 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386552 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386557 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386562 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386567 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386571 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386575 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386579 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386584 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386594 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:38.394739 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386598 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.386602 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.387275 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.393915 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.393933 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393981 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393985 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393989 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393993 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393996 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.393999 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394002 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394005 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394008 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394010 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394013 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:38.395255 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394016 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394019 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394022 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394024 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394027 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394030 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394033 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394036 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394041 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394043 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394047 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394049 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394052 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394055 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394058 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394061 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394063 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394066 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394069 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394071 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:38.395663 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394079 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394082 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394085 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394087 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394090 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394093 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394095 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394098 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394101 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394104 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394106 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394109 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394112 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394114 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394117 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394120 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394123 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394125 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394128 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394130 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:38.396185 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394133 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394135 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394138 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394141 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394144 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394146 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394149 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394151 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394154 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394156 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394159 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394161 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394165 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394168 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394171 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394173 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394176 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394178 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394181 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:38.396673 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394183 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394186 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394188 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394191 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394194 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394196 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394199 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394201 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394204 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394208 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394212 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394215 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394218 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394220 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394223 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:38.397160 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394226 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.394231 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394336 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394341 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394343 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394346 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394349 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394352 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394355 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394358 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394361 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394365 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394371 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394374 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394376 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394379 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:38.397530 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394382 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394384 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394387 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394390 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394392 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394395 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394400 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394403 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394406 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394408 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394411 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394414 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394417 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394419 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394422 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394424 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394427 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394429 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394432 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:38.397952 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394435 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394437 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
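The feature_gate.go:384 lines above record the effective gate map once parsing finishes, with unknown names already dropped. A sketch that recovers such a map[string]bool from the logged format, assuming the exact `feature gates: {map[...]}` layout (sample shortened to a few gates):

    // parse_gate_map.go — a sketch that parses the effective gate map from the
    // "feature gates: {map[...]}" line logged at feature_gate.go:384 above.
    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        line := "feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false ServiceAccountTokenNodeBinding:true]}"
        start := strings.Index(line, "map[") + len("map[")
        end := strings.LastIndex(line, "]")
        gates := map[string]bool{}
        for _, pair := range strings.Fields(line[start:end]) {
            if name, value, ok := strings.Cut(pair, ":"); ok {
                gates[name] = value == "true"
            }
        }
        fmt.Println(gates["KMSv1"], gates["NodeSwap"]) // true false
    }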
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394440 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394443 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394445 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394448 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394451 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394456 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394460 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394463 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394467 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394469 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394472 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394475 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394478 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394480 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394483 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394486 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394489 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:38.398464 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394491 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394494 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394497 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394499 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394502 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394504 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394507 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394510 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394512 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394515 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394517 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394520 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394522 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394525 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394527 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394530 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394532 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394535 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394537 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:38.398983 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394540 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394543 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394546 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394548 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394551 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394554 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394557 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394559 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394562 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394564 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394568 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394572 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394575 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:38.394578 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.394582 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:38.399438 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.395223 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:38.399819 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.397977 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:38.399819 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.399117 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:38.399819 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.399216 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:38.399819 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.399260 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:38.423544 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.423524 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:38.426363 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.426347 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:38.441861 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.441840 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:30:38.447183 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.447166 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 18:30:38.449230 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.449212 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:30:38.453272 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.453251 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ad5c2b28-1275-4baa-a06d-d491f3a9fe1c:/dev/nvme0n1p4 d1111f6b-0bb1-4164-b09a-b1df1b1d4a32:/dev/nvme0n1p3]
Apr 16 18:30:38.453351 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.453271 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:30:38.456566 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.456544 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:38.459135 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.459019 2577 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:38.457245057 +0000 UTC m=+0.402675759 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099299 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f28dcd9ca9483fc6f1de5c0ca0c33 SystemUUID:ec2f28dc-d9ca-9483-fc6f-1de5c0ca0c33 BootID:74d87ccb-c91a-407c-a794-dd2d96593081 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9e:89:a4:f3:5d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9e:89:a4:f3:5d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:95:0d:01:43:99 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:30:38.459135 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.459133 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
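The fs.go:136 entry above flattens the node's partition table into a Go map literal. A sketch that extracts device, mountpoint and fsType from that format with a regular expression; the sample string is shortened to two of the partitions logged above:

    // parse_partitions.go — a sketch, assuming the fs.go:136 "Filesystem partitions:"
    // map format shown in this log; it pulls out device, mountpoint and fsType.
    package main

    import (
        "fmt"
        "regexp"
    )

    func main() {
        entry := `map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0}]`
        re := regexp.MustCompile(`(\S+):\{mountpoint:(\S+) major:\d+ minor:\d+ fsType:(\S+)`)
        for _, m := range re.FindAllStringSubmatch(entry, -1) {
            fmt.Printf("device=%s mountpoint=%s fsType=%s\n", m[1], m[2], m[3])
        }
    }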
Apr 16 18:30:38.459266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.459253 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:30:38.461416 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461393 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:30:38.461552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461418 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-1.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:30:38.461599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461561 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:30:38.461599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461571 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:30:38.461599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461584 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:38.461599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.461597 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:38.462414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.462403 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:30:38.462523 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.462514 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:30:38.464954 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.464945 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:30:38.464996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.464958 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:30:38.464996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.464970 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:30:38.464996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.464979 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:30:38.464996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.464989 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:30:38.466459 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.466447 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:38.466512 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.466470 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:38.470480 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.470463 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:30:38.472608 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.472246 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:30:38.473507 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473495 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473514 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473524 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473533 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473541 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473549 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473555 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:30:38.473563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473561 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:30:38.473795 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473569 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:30:38.473795 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473575 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:30:38.473795 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473584 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:30:38.473795 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.473593 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:30:38.474431 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.474419 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:30:38.474460 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.474432 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:30:38.477662 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.477641 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:30:38.477662 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.477653 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-1.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:30:38.477796 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.477653 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-1.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:30:38.478359 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478347 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:30:38.478397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478379 2577 server.go:1295] "Started kubelet"
Apr 16 18:30:38.478444 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478430 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pg4hd"
Apr 16 18:30:38.478634 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478569 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:30:38.478732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478655 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:30:38.478832 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.478811 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:30:38.479190 ip-10-0-140-1 systemd[1]: Started Kubernetes Kubelet.
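The csr-pg4hd lines around here show the TLS bootstrap flow: requests made as system:anonymous are rejected until the kubelet's client certificate signing request is approved and issued, after which the same watches succeed. A sketch of how an admin might list the cluster's CSRs with client-go to follow that handshake; the kubeconfig path is an assumption, not something taken from this log:

    // list_kubelet_csrs.go — a sketch of inspecting CSRs such as csr-pg4hd above.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Hypothetical admin kubeconfig location; adjust for your cluster.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.kubeconfig")
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        csrs, err := client.CertificatesV1().CertificateSigningRequests().List(context.Background(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, csr := range csrs.Items {
            fmt.Println(csr.Name, csr.Spec.SignerName, csr.Spec.Username)
        }
    }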
Apr 16 18:30:38.479826 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.479811 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:30:38.480602 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.480590 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:30:38.485709 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.484412 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-1.ec2.internal.18a6e9dd0235b96b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-1.ec2.internal,UID:ip-10-0-140-1.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-1.ec2.internal,},FirstTimestamp:2026-04-16 18:30:38.478358891 +0000 UTC m=+0.423789594,LastTimestamp:2026-04-16 18:30:38.478358891 +0000 UTC m=+0.423789594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-1.ec2.internal,}"
Apr 16 18:30:38.486924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.486902 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pg4hd"
Apr 16 18:30:38.487461 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.487088 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:38.487555 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.487500 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:30:38.488311 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.488292 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:30:38.488380 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.488361 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:30:38.488979 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.488811 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:30:38.489072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.489041 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:30:38.489072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.489053 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:30:38.489903 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.489751 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.491198 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491179 2577 factory.go:55] Registering systemd factory
Apr 16 18:30:38.491198 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491198 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:30:38.491442 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491426 2577 factory.go:153] Registering CRI-O factory
Apr 16 18:30:38.491442 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491444 2577 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:30:38.491582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491548 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:30:38.491582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491575 2577 factory.go:103] Registering Raw factory
Apr 16 18:30:38.491681 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.491591 2577 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:30:38.492028 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.492006 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:30:38.492028 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.492028 2577 manager.go:319] Starting recovery of all containers
Apr 16 18:30:38.492870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.492846 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:38.496410 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.496384 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-1.ec2.internal\" not found" node="ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.501674 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.501659 2577 manager.go:324] Recovery completed
Apr 16 18:30:38.505429 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.505417 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.507587 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.507570 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.507675 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.507604 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.507675 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.507620 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.508139 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.508126 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:30:38.508139 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.508136 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:30:38.508267 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.508151 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:30:38.510307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.510293 2577 policy_none.go:49] "None policy: Start"
Apr 16 18:30:38.510307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.510309 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:30:38.510407 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.510319 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:30:38.554368 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554350 2577 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:30:38.554675 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.554427 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:30:38.554675 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554437 2577 server.go:85] "Starting device plugin registration server"
Apr 16 18:30:38.554675 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554657 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:30:38.554847 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554670 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:30:38.554891 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554844 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:30:38.554971 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554957 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:30:38.554971 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.554971 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:30:38.555382 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.555362 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:30:38.555469 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.555402 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.611792 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.611745 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:30:38.613012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.612995 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:30:38.613098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.613022 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:30:38.613098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.613041 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
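The eviction manager started above enforces the HardEvictionThresholds from the nodeConfig entry earlier in this log, for example memory.available LessThan 100Mi. A sketch of that comparison with illustrative numbers; only MemoryCapacity comes from the Machine entry, the working-set value is hypothetical:

    // eviction_check.go — a sketch of the memory.available hard-eviction rule.
    package main

    import "fmt"

    func main() {
        const thresholdBytes = 100 << 20 // 100Mi, as in HardEvictionThresholds above

        capacity := int64(33164496896) // MemoryCapacity reported at manager.go:217
        workingSet := int64(30) << 30  // hypothetical node-level working set (30 GiB)

        available := capacity - workingSet
        if available < thresholdBytes {
            fmt.Printf("eviction signal: memory.available=%d < %d\n", available, thresholdBytes)
        } else {
            fmt.Printf("ok: memory.available=%d bytes\n", available)
        }
    }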
Apr 16 18:30:38.613098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.613048 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:30:38.613247 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.613098 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:30:38.615686 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.615668 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:38.655705 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.655657 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.656656 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.656642 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.656744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.656668 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.656744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.656678 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.656744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.656702 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.663191 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.663173 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.663298 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.663198 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-1.ec2.internal\": node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.686506 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.686481 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.713522 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.713489 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"]
Apr 16 18:30:38.713579 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.713561 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.715073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.715050 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.715153 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.715081 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.715153 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.715095 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.716362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.716348 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.716498 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.716481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.716546 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.716513 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.717042 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717023 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.717117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717028 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.717117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717080 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.717117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717096 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.717117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717053 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.717311 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.717136 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.718641 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.718625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.718732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.718655 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:30:38.719319 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.719295 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:30:38.719388 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.719322 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:30:38.719388 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.719334 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:30:38.741323 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.741306 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-1.ec2.internal\" not found" node="ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.745611 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.745594 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-1.ec2.internal\" not found" node="ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.786798 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.786761 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.791384 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.791365 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.791476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.791422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.791476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.791453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d652678662059d52536902d6dffe6ef4-config\") pod \"kube-apiserver-proxy-ip-10-0-140-1.ec2.internal\" (UID: \"d652678662059d52536902d6dffe6ef4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.887902 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.887873 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:38.892185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.892254 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.892293 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.892335 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d652678662059d52536902d6dffe6ef4-config\") pod \"kube-apiserver-proxy-ip-10-0-140-1.ec2.internal\" (UID: \"d652678662059d52536902d6dffe6ef4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.892385 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d652678662059d52536902d6dffe6ef4-config\") pod \"kube-apiserver-proxy-ip-10-0-140-1.ec2.internal\" (UID: \"d652678662059d52536902d6dffe6ef4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.892385 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:38.892370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/88424cbdbf8cac342ecdd775f1456f96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal\" (UID: \"88424cbdbf8cac342ecdd775f1456f96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:38.988648 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:38.988550 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.044901 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.044868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:39.048501 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.048483 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:39.089406 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.089375 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.189815 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.189784 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.290345 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.290275 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.390750 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.390726 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.399004 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.398987 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:30:39.399125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.399111 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:39.399170 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.399150 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:39.472867 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.472835 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:39.487486 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.487456 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:39.489493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.489463 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:38 +0000 UTC" deadline="2027-12-28 21:50:34.438258764 +0000 UTC"
Apr 16 18:30:39.489493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.489490 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14907h19m54.948771725s"
Apr 16 18:30:39.491616 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.491599 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.497632 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.497615 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:39.520517 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.520495 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v5hbg"
Apr 16 18:30:39.528480 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.528461 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v5hbg"
Apr 16 18:30:39.586549 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:39.586514 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd652678662059d52536902d6dffe6ef4.slice/crio-03faef3aa945a39bab5da3bc3711a645ac567bf4f401b45178af61a7ae654a7a WatchSource:0}: Error finding container 03faef3aa945a39bab5da3bc3711a645ac567bf4f401b45178af61a7ae654a7a: Status 404 returned error can't find the container with id 03faef3aa945a39bab5da3bc3711a645ac567bf4f401b45178af61a7ae654a7a
Apr 16 18:30:39.586793 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:39.586756 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88424cbdbf8cac342ecdd775f1456f96.slice/crio-808c2216cc0e84cce09ae8dcbe769ea560a7290c3f66ac8385b85aa8f383a88f WatchSource:0}: Error finding container 808c2216cc0e84cce09ae8dcbe769ea560a7290c3f66ac8385b85aa8f383a88f: Status 404 returned error can't find the container with id 808c2216cc0e84cce09ae8dcbe769ea560a7290c3f66ac8385b85aa8f383a88f
Apr 16 18:30:39.591736 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.591717 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.591818 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.591804 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:39.616257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.616214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal" event={"ID":"d652678662059d52536902d6dffe6ef4","Type":"ContainerStarted","Data":"03faef3aa945a39bab5da3bc3711a645ac567bf4f401b45178af61a7ae654a7a"}
Apr 16 18:30:39.617119 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.617102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal" event={"ID":"88424cbdbf8cac342ecdd775f1456f96","Type":"ContainerStarted","Data":"808c2216cc0e84cce09ae8dcbe769ea560a7290c3f66ac8385b85aa8f383a88f"}
Apr 16 18:30:39.692493 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.692464 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.792975 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.792948 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.893525 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:39.893452 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-1.ec2.internal\" not found"
Apr 16 18:30:39.893810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.893793 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:39.988672 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:39.988634 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:40.002207 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.002182 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:40.003624 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.003599 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal"
Apr 16 18:30:40.014124 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.014099 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:40.392594 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.392559 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:40.465938 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.465910 2577 apiserver.go:52] "Watching apiserver"
Apr 16 18:30:40.474726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.474699 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:30:40.475199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.475171 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-mw7zk","kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal","openshift-image-registry/node-ca-7rdcc","openshift-multus/multus-9fkg9","openshift-multus/network-metrics-daemon-tldk9","openshift-network-diagnostics/network-check-target-57qhk","openshift-network-operator/iptables-alerter-mmhj6","openshift-ovn-kubernetes/ovnkube-node-dsp8f","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp","openshift-cluster-node-tuning-operator/tuned-lgzq4","openshift-dns/node-resolver-7jl4x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal","openshift-multus/multus-additional-cni-plugins-49rzx"]
Apr 16 18:30:40.476883 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.476739 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.479685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.479535 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.479685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.479564 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sdrgb\""
Apr 16 18:30:40.479685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.479573 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.479904 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.479690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7rdcc"
Apr 16 18:30:40.482736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.481958 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s6bzs\""
Apr 16 18:30:40.482736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.482101 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.482736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.482207 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.482736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.482215 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.482736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.482509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:30:40.484077 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.484054 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9"
Apr 16 18:30:40.484194 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.484157 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1"
Apr 16 18:30:40.484280 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.484257 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.485307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.484844 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:30:40.485307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.484910 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.485307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.484921 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xkbdg\""
Apr 16 18:30:40.486432 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.486415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.487411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.486880 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qq9d8\""
Apr 16 18:30:40.487411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487096 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:30:40.487411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487268 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.487411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487312 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:30:40.487411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487333 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.487735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487517 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:30:40.487735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.487614 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:30:40.488353 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.488017 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mw7zk"
Apr 16 18:30:40.488840 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.488821 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:30:40.488924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.488837 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.489191 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.489255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489223 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.489433 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489405 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:30:40.489433 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489416 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qz94d\""
Apr 16 18:30:40.489590 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.489996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.489978 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:30:40.490325 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.490304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-947w9\""
Apr 16 18:30:40.490573 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.490556 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:30:40.490929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.490910 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.491538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.491522 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xnp2l\""
Apr 16 18:30:40.491960 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.491941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.492064 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.492038 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.492505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.492484 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk"
Apr 16 18:30:40.492665 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.492635 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974"
Apr 16 18:30:40.492738 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.492727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:30:40.493197 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.492887 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xn4bx\""
Apr 16 18:30:40.493197 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.493178 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:30:40.494405 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.494384 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mmhj6"
Apr 16 18:30:40.496166 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.496143 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wg4gm\""
Apr 16 18:30:40.496257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.496181 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:40.496501 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.496484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:30:40.496664 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.496450 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:40.498554 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-bin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-conf\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.498756 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-lib-modules\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.498756 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-socket-dir-parent\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-kubelet\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-hostroot\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97m9d\" (UniqueName: \"kubernetes.io/projected/b18d081b-3d3f-48e8-8f52-8eb619b60b77-kube-api-access-97m9d\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-multus\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-script-lib\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4wj\" (UniqueName: \"kubernetes.io/projected/b86bb118-f0ab-4605-860a-df81a23f9124-kube-api-access-5s4wj\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.498996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.498992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-tmp\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.499329 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-kubelet\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499329 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-systemd-units\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499329 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-var-lib-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499329 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499335 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzzn\" (UniqueName: \"kubernetes.io/projected/8f66e95f-32ea-4c62-b967-18110b01aac3-kube-api-access-9tzzn\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc"
Apr 16 18:30:40.499596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-modprobe-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.499596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499467 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:40.499596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cnibin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.499596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-config\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-env-overrides\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.499853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.499957 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm58m\" (UniqueName: \"kubernetes.io/projected/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-kube-api-access-fm58m\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9"
Apr 16 18:30:40.499957 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5626j\" (UniqueName: \"kubernetes.io/projected/6e091b04-e5d1-4928-9203-5358e7ad1e2a-kube-api-access-5626j\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.499957 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-k8s-cni-cncf-io\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.499977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-netns\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f66e95f-32ea-4c62-b967-18110b01aac3-serviceca\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-kubernetes\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-tuned\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-slash\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.500114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-device-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500185 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-conf-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-systemd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500271 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/85759909-428a-4c11-95e0-96f51d6580f6-agent-certs\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-systemd\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-multus-certs\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-etc-kubernetes\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-run\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cni-binary-copy\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvg7\" (UniqueName: \"kubernetes.io/projected/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-kube-api-access-jxvg7\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500518 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b86bb118-f0ab-4605-860a-df81a23f9124-hosts-file\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-host\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-sys\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.500710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-system-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-netns\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-etc-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-os-release\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-ovn\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f66e95f-32ea-4c62-b967-18110b01aac3-host\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovn-node-metrics-cert\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.500968 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-socket-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysconfig\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-var-lib-kubelet\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-daemon-config\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-node-log\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-bin\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501281 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b86bb118-f0ab-4605-860a-df81a23f9124-tmp-dir\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-log-socket\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501340 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-netd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhl4\" (UniqueName: \"kubernetes.io/projected/8184d437-e11f-4dc1-a89a-e831d28d24ff-kube-api-access-kvhl4\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/85759909-428a-4c11-95e0-96f51d6580f6-konnectivity-ca\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk"
Apr 16 18:30:40.501907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.501508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.529815 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.529778 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:39 +0000 UTC" deadline="2028-01-17 09:44:36.698172472 +0000 UTC"
Apr 16 18:30:40.529815 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.529812 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15375h13m56.168364871s"
Apr 16 18:30:40.589523 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.589499 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:30:40.602155 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-conf\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-lib-modules\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-socket-dir-parent\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-kubelet\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-hostroot\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97m9d\" (UniqueName: \"kubernetes.io/projected/b18d081b-3d3f-48e8-8f52-8eb619b60b77-kube-api-access-97m9d\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-multus\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.602301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-script-lib\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-lib-modules\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.602615 ip-10-0-140-1
kubenswrapper[2577]: I0416 18:30:40.602325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-kubelet\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s4wj\" (UniqueName: \"kubernetes.io/projected/b86bb118-f0ab-4605-860a-df81a23f9124-kube-api-access-5s4wj\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-multus\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-cnibin\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-socket-dir-parent\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-tmp\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-hostroot\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.602615 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-kubelet\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-systemd-units\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:30:40.602656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-conf\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-kubelet\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-var-lib-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-var-lib-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-systemd-units\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzzn\" (UniqueName: \"kubernetes.io/projected/8f66e95f-32ea-4c62-b967-18110b01aac3-kube-api-access-9tzzn\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.602932 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:30:40.603035 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6194e8f3-e97c-49da-8ebb-4764a9a77850-host-slash\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-modprobe-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cnibin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-config\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-env-overrides\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-script-lib\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cnibin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-modprobe-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm58m\" (UniqueName: \"kubernetes.io/projected/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-kube-api-access-fm58m\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-os-release\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4djd\" (UniqueName: \"kubernetes.io/projected/6194e8f3-e97c-49da-8ebb-4764a9a77850-kube-api-access-d4djd\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5626j\" (UniqueName: \"kubernetes.io/projected/6e091b04-e5d1-4928-9203-5358e7ad1e2a-kube-api-access-5626j\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-k8s-cni-cncf-io\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 
16 18:30:40.603463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-netns\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f66e95f-32ea-4c62-b967-18110b01aac3-serviceca\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-kubernetes\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-tuned\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-slash\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-device-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-env-overrides\") pod \"ovnkube-node-dsp8f\" (UID: 
\"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-kubernetes\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-netns\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-k8s-cni-cncf-io\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hdn\" (UniqueName: \"kubernetes.io/projected/f7ecb9d0-5eb7-46c9-b65f-725014636854-kube-api-access-49hdn\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-conf-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.604236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-systemd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/85759909-428a-4c11-95e0-96f51d6580f6-agent-certs\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-device-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6194e8f3-e97c-49da-8ebb-4764a9a77850-iptables-alerter-script\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-systemd\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-conf-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-multus-certs\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-etc-kubernetes\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-systemd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.603993 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8f66e95f-32ea-4c62-b967-18110b01aac3-serviceca\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-run\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-run\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cni-binary-copy\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-systemd\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-run-multus-certs\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovnkube-config\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604236 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-etc-kubernetes\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-slash\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-ovn-kubernetes\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvg7\" (UniqueName: \"kubernetes.io/projected/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-kube-api-access-jxvg7\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b86bb118-f0ab-4605-860a-df81a23f9124-hosts-file\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-host\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-system-cni-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b86bb118-f0ab-4605-860a-df81a23f9124-hosts-file\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 
18:30:40.604444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-sys\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-host\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-system-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-netns\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-sys\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-etc-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-system-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-os-release\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.605864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604550 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-run-netns\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-ovn\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-os-release\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-etc-openvswitch\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-cni-binary-copy\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-run-ovn\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f66e95f-32ea-4c62-b967-18110b01aac3-host\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f66e95f-32ea-4c62-b967-18110b01aac3-host\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovn-node-metrics-cert\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-socket-dir\") 
pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysconfig\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-var-lib-kubelet\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-daemon-config\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-node-log\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-bin\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysconfig\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b86bb118-f0ab-4605-860a-df81a23f9124-tmp-dir\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604888 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-log-socket\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.606697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-netd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-socket-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhl4\" (UniqueName: \"kubernetes.io/projected/8184d437-e11f-4dc1-a89a-e831d28d24ff-kube-api-access-kvhl4\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-bin\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.604996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/85759909-428a-4c11-95e0-96f51d6580f6-konnectivity-ca\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-var-lib-kubelet\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:40.607550 
ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-bin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8184d437-e11f-4dc1-a89a-e831d28d24ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-binary-copy\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-host-var-lib-cni-bin\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-host-cni-netd\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-node-log\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-daemon-config\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9" 
Apr 16 18:30:40.607550 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.605446 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-multus-cni-dir\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.605537 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:41.105499284 +0000 UTC m=+3.050929977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-sysctl-d\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b18d081b-3d3f-48e8-8f52-8eb619b60b77-log-socket\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b86bb118-f0ab-4605-860a-df81a23f9124-tmp-dir\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.605982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/85759909-428a-4c11-95e0-96f51d6580f6-konnectivity-ca\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.606370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-tmp\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.606648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/85759909-428a-4c11-95e0-96f51d6580f6-agent-certs\") pod \"konnectivity-agent-mw7zk\" (UID: \"85759909-428a-4c11-95e0-96f51d6580f6\") " pod="kube-system/konnectivity-agent-mw7zk"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.606758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e091b04-e5d1-4928-9203-5358e7ad1e2a-etc-tuned\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.608259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.607804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b18d081b-3d3f-48e8-8f52-8eb619b60b77-ovn-node-metrics-cert\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.614674 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.614647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhl4\" (UniqueName: \"kubernetes.io/projected/8184d437-e11f-4dc1-a89a-e831d28d24ff-kube-api-access-kvhl4\") pod \"aws-ebs-csi-driver-node-7xmpp\" (UID: \"8184d437-e11f-4dc1-a89a-e831d28d24ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp"
Apr 16 18:30:40.617844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.615207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5626j\" (UniqueName: \"kubernetes.io/projected/6e091b04-e5d1-4928-9203-5358e7ad1e2a-kube-api-access-5626j\") pod \"tuned-lgzq4\" (UID: \"6e091b04-e5d1-4928-9203-5358e7ad1e2a\") " pod="openshift-cluster-node-tuning-operator/tuned-lgzq4"
Apr 16 18:30:40.617844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.615583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97m9d\" (UniqueName: \"kubernetes.io/projected/b18d081b-3d3f-48e8-8f52-8eb619b60b77-kube-api-access-97m9d\") pod \"ovnkube-node-dsp8f\" (UID: \"b18d081b-3d3f-48e8-8f52-8eb619b60b77\") " pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f"
Apr 16 18:30:40.617844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.615615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvg7\" (UniqueName: \"kubernetes.io/projected/c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3-kube-api-access-jxvg7\") pod \"multus-9fkg9\" (UID: \"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3\") " pod="openshift-multus/multus-9fkg9"
Apr 16 18:30:40.617844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.615628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm58m\" (UniqueName: \"kubernetes.io/projected/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-kube-api-access-fm58m\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9"
Apr 16 18:30:40.617844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.616036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzzn\" (UniqueName: \"kubernetes.io/projected/8f66e95f-32ea-4c62-b967-18110b01aac3-kube-api-access-9tzzn\") pod \"node-ca-7rdcc\" (UID: \"8f66e95f-32ea-4c62-b967-18110b01aac3\") " pod="openshift-image-registry/node-ca-7rdcc"
Apr 16 18:30:40.618710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.618666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4wj\" (UniqueName: \"kubernetes.io/projected/b86bb118-f0ab-4605-860a-df81a23f9124-kube-api-access-5s4wj\") pod \"node-resolver-7jl4x\" (UID: \"b86bb118-f0ab-4605-860a-df81a23f9124\") " pod="openshift-dns/node-resolver-7jl4x"
Apr 16 18:30:40.706011 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.705922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk"
Apr 16 18:30:40.706011 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.705960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-binary-copy\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.706011 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.705993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-cnibin\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6194e8f3-e97c-49da-8ebb-4764a9a77850-host-slash\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6"
Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-cnibin\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-os-release\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx"
Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4djd\" (UniqueName: \"kubernetes.io/projected/6194e8f3-e97c-49da-8ebb-4764a9a77850-kube-api-access-d4djd\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") "
pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6194e8f3-e97c-49da-8ebb-4764a9a77850-host-slash\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49hdn\" (UniqueName: \"kubernetes.io/projected/f7ecb9d0-5eb7-46c9-b65f-725014636854-kube-api-access-49hdn\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-os-release\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706250 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6194e8f3-e97c-49da-8ebb-4764a9a77850-iptables-alerter-script\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.706783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-system-cni-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-system-cni-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706655 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-binary-copy\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706969 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6194e8f3-e97c-49da-8ebb-4764a9a77850-iptables-alerter-script\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.706969 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7ecb9d0-5eb7-46c9-b65f-725014636854-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.706969 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.706840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.707077 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.707055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7ecb9d0-5eb7-46c9-b65f-725014636854-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.711825 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.711803 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:40.711825 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.711826 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:40.712005 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.711841 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:40.712005 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:40.711915 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:41.211897398 +0000 UTC m=+3.157328105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:40.713902 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.713883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hdn\" (UniqueName: \"kubernetes.io/projected/f7ecb9d0-5eb7-46c9-b65f-725014636854-kube-api-access-49hdn\") pod \"multus-additional-cni-plugins-49rzx\" (UID: \"f7ecb9d0-5eb7-46c9-b65f-725014636854\") " pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.713982 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.713929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4djd\" (UniqueName: \"kubernetes.io/projected/6194e8f3-e97c-49da-8ebb-4764a9a77850-kube-api-access-d4djd\") pod \"iptables-alerter-mmhj6\" (UID: \"6194e8f3-e97c-49da-8ebb-4764a9a77850\") " pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:40.796931 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.796892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" Apr 16 18:30:40.805581 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.805558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7rdcc" Apr 16 18:30:40.814083 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.814057 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9fkg9" Apr 16 18:30:40.820756 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.820731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:30:40.828337 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.828319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" Apr 16 18:30:40.835850 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.835834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:30:40.842392 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.842373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7jl4x" Apr 16 18:30:40.848913 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.848896 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49rzx" Apr 16 18:30:40.856421 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.856401 2577 util.go:30] "No sandbox for pod can be found. 
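[Annotation] The nestedpendingoperations.go:348 entries above and below trace the kubelet's per-volume retry backoff: the first failed MountVolume is retried after 500ms, and each consecutive failure doubles the wait, which is why the metrics-certs and kube-api-access-76zw2 retries step through 500ms, 1s, 2s, 4s, 8s and 16s over the rest of this log. A sketch of that schedule follows; the 2m2s cap mirrors upstream kubelet's exponential backoff for volume operations, but treat the exact constant as an assumption:

package main

import (
	"fmt"
	"time"
)

const (
	initialBackoff = 500 * time.Millisecond           // first durationBeforeRetry in the log
	maxBackoff     = 2*time.Minute + 2*time.Second    // assumed cap; illustrative
)

// durationBeforeRetry doubles the wait for each consecutive failure,
// reproducing the 500ms, 1s, 2s, 4s, 8s, 16s progression in this log.
func durationBeforeRetry(consecutiveFailures int) time.Duration {
	d := initialBackoff << (consecutiveFailures - 1)
	if d <= 0 || d > maxBackoff { // clamp, guarding shift overflow
		return maxBackoff
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> retry in %v\n", n, durationBeforeRetry(n))
	}
}

The failures themselves are not fatal: the operation is parked ("No retries permitted until ...") and the reconcile loop picks it up again once the deadline passes.

Apr 16 18:30:40.856421 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:40.856401 2577 util.go:30] "No sandbox for pod can be found. 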
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mmhj6" Apr 16 18:30:41.109860 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.109827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:41.110037 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.109949 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:41.110037 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.110033 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:42.110011929 +0000 UTC m=+4.055442623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:41.232916 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.232897 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d0bf99_a3c2_47cc_acdf_4ffec50d8ba3.slice/crio-1b9af476b9cf10de4d668560ad6d53013c33821f2f51d11ad17c8fb4f1580196 WatchSource:0}: Error finding container 1b9af476b9cf10de4d668560ad6d53013c33821f2f51d11ad17c8fb4f1580196: Status 404 returned error can't find the container with id 1b9af476b9cf10de4d668560ad6d53013c33821f2f51d11ad17c8fb4f1580196 Apr 16 18:30:41.235903 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.235878 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8184d437_e11f_4dc1_a89a_e831d28d24ff.slice/crio-bceed74e2407c5b498fa17e65bae2a7ec855ff942e5e38d85ab3d7132d469865 WatchSource:0}: Error finding container bceed74e2407c5b498fa17e65bae2a7ec855ff942e5e38d85ab3d7132d469865: Status 404 returned error can't find the container with id bceed74e2407c5b498fa17e65bae2a7ec855ff942e5e38d85ab3d7132d469865 Apr 16 18:30:41.237174 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.237149 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ecb9d0_5eb7_46c9_b65f_725014636854.slice/crio-c4630a0ee455cae2d54d4ec0798356210186c25d7f90715b12aa8428e6ebdb09 WatchSource:0}: Error finding container c4630a0ee455cae2d54d4ec0798356210186c25d7f90715b12aa8428e6ebdb09: Status 404 returned error can't find the container with id c4630a0ee455cae2d54d4ec0798356210186c25d7f90715b12aa8428e6ebdb09 Apr 16 18:30:41.238069 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.238018 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85759909_428a_4c11_95e0_96f51d6580f6.slice/crio-5e97a193ac3245dbfc523763f93075f5f0727d84ed7205810c7bc8b788aa7465 WatchSource:0}: Error finding container 5e97a193ac3245dbfc523763f93075f5f0727d84ed7205810c7bc8b788aa7465: Status 404 returned error can't find the container with 
id 5e97a193ac3245dbfc523763f93075f5f0727d84ed7205810c7bc8b788aa7465 Apr 16 18:30:41.238943 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.238741 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86bb118_f0ab_4605_860a_df81a23f9124.slice/crio-53e93366afc2dda53df7e00ec0a48f4ec43be59a7cf1592b9304d39b686c2c44 WatchSource:0}: Error finding container 53e93366afc2dda53df7e00ec0a48f4ec43be59a7cf1592b9304d39b686c2c44: Status 404 returned error can't find the container with id 53e93366afc2dda53df7e00ec0a48f4ec43be59a7cf1592b9304d39b686c2c44 Apr 16 18:30:41.240225 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.239999 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e091b04_e5d1_4928_9203_5358e7ad1e2a.slice/crio-5cc593140cd40f5d25cbeb75f2b1f54f339507c264b374c1d8016fc7fc4e0552 WatchSource:0}: Error finding container 5cc593140cd40f5d25cbeb75f2b1f54f339507c264b374c1d8016fc7fc4e0552: Status 404 returned error can't find the container with id 5cc593140cd40f5d25cbeb75f2b1f54f339507c264b374c1d8016fc7fc4e0552 Apr 16 18:30:41.240864 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.240838 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6194e8f3_e97c_49da_8ebb_4764a9a77850.slice/crio-a18d33cc6c210f5a1c95a2dccff42fcf9777b5dd8aa082820f297d0a3acca229 WatchSource:0}: Error finding container a18d33cc6c210f5a1c95a2dccff42fcf9777b5dd8aa082820f297d0a3acca229: Status 404 returned error can't find the container with id a18d33cc6c210f5a1c95a2dccff42fcf9777b5dd8aa082820f297d0a3acca229 Apr 16 18:30:41.243063 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.241791 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f66e95f_32ea_4c62_b967_18110b01aac3.slice/crio-067eded90c190c29a63f4cb32b09804e9afcebfbd4354147131bbe1670e51b2d WatchSource:0}: Error finding container 067eded90c190c29a63f4cb32b09804e9afcebfbd4354147131bbe1670e51b2d: Status 404 returned error can't find the container with id 067eded90c190c29a63f4cb32b09804e9afcebfbd4354147131bbe1670e51b2d Apr 16 18:30:41.243063 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:30:41.242572 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18d081b_3d3f_48e8_8f52_8eb619b60b77.slice/crio-580668c9d86b4710c64974a493160ee233ec3a8f48a05c863714ed282bc270d0 WatchSource:0}: Error finding container 580668c9d86b4710c64974a493160ee233ec3a8f48a05c863714ed282bc270d0: Status 404 returned error can't find the container with id 580668c9d86b4710c64974a493160ee233ec3a8f48a05c863714ed282bc270d0 Apr 16 18:30:41.311561 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.311537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:41.311657 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.311647 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:41.311698 
ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.311661 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:41.311698 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.311670 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:41.311814 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:41.311715 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:42.311700857 +0000 UTC m=+4.257131551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:41.529999 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.529901 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:39 +0000 UTC" deadline="2027-11-12 08:38:22.827855687 +0000 UTC" Apr 16 18:30:41.529999 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.529929 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13790h7m41.297929253s" Apr 16 18:30:41.629701 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.628939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerStarted","Data":"c4630a0ee455cae2d54d4ec0798356210186c25d7f90715b12aa8428e6ebdb09"} Apr 16 18:30:41.631326 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.631295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" event={"ID":"8184d437-e11f-4dc1-a89a-e831d28d24ff","Type":"ContainerStarted","Data":"bceed74e2407c5b498fa17e65bae2a7ec855ff942e5e38d85ab3d7132d469865"} Apr 16 18:30:41.633641 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.633613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" event={"ID":"6e091b04-e5d1-4928-9203-5358e7ad1e2a","Type":"ContainerStarted","Data":"5cc593140cd40f5d25cbeb75f2b1f54f339507c264b374c1d8016fc7fc4e0552"} Apr 16 18:30:41.635155 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.635083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7jl4x" event={"ID":"b86bb118-f0ab-4605-860a-df81a23f9124","Type":"ContainerStarted","Data":"53e93366afc2dda53df7e00ec0a48f4ec43be59a7cf1592b9304d39b686c2c44"}
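[Annotation] The certificate_manager.go lines above show how the kubelet schedules rotation of its serving certificate: rather than waiting for expiry, it picks a deadline at a jittered fraction of the certificate's validity window and then sleeps until that point (here sleep=13790h, about 19 months). The logged numbers are consistent with that: the certificate runs 2026-04-16 to 2028-04-15, and the chosen deadline of 2027-11-12 sits at roughly 79% of the window. A sketch of the scheduling; the 0.7 + 0.2*rand factor approximates client-go's certificate manager and the exact constants are an assumption:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a jittered point inside the certificate's
// validity window, in the style of the kubelet's certificate manager.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Validity window taken from the expiration= field above; notBefore is
	// assumed to be issuance at node bootstrap.
	notBefore := time.Date(2026, 4, 16, 18, 25, 39, 0, time.UTC)
	notAfter := time.Date(2028, 4, 15, 18, 25, 39, 0, time.UTC)
	d := rotationDeadline(notBefore, notAfter)
	fmt.Printf("deadline %v (%.0f%% of validity)\n", d,
		100*float64(d.Sub(notBefore))/float64(notAfter.Sub(notBefore)))
}

Apr 16 18:30:41.641568 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.641541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9fkg9" 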
event={"ID":"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3","Type":"ContainerStarted","Data":"1b9af476b9cf10de4d668560ad6d53013c33821f2f51d11ad17c8fb4f1580196"} Apr 16 18:30:41.648266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.648237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal" event={"ID":"d652678662059d52536902d6dffe6ef4","Type":"ContainerStarted","Data":"b893164c7b0c2f8b04fe531b48233b6556ee3290eb74d0f850f5699035199700"} Apr 16 18:30:41.650422 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.650370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"580668c9d86b4710c64974a493160ee233ec3a8f48a05c863714ed282bc270d0"} Apr 16 18:30:41.652079 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.652042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7rdcc" event={"ID":"8f66e95f-32ea-4c62-b967-18110b01aac3","Type":"ContainerStarted","Data":"067eded90c190c29a63f4cb32b09804e9afcebfbd4354147131bbe1670e51b2d"} Apr 16 18:30:41.657296 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.657251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mmhj6" event={"ID":"6194e8f3-e97c-49da-8ebb-4764a9a77850","Type":"ContainerStarted","Data":"a18d33cc6c210f5a1c95a2dccff42fcf9777b5dd8aa082820f297d0a3acca229"} Apr 16 18:30:41.667652 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.666723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-1.ec2.internal" podStartSLOduration=1.666708466 podStartE2EDuration="1.666708466s" podCreationTimestamp="2026-04-16 18:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:41.666103544 +0000 UTC m=+3.611534277" watchObservedRunningTime="2026-04-16 18:30:41.666708466 +0000 UTC m=+3.612139180" Apr 16 18:30:41.668550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:41.668278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mw7zk" event={"ID":"85759909-428a-4c11-95e0-96f51d6580f6","Type":"ContainerStarted","Data":"5e97a193ac3245dbfc523763f93075f5f0727d84ed7205810c7bc8b788aa7465"} Apr 16 18:30:42.121528 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.120955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:42.121528 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.121123 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:42.121528 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.121184 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:44.12116539 +0000 UTC m=+6.066596081 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:42.323925 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.323300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:42.323925 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.323462 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:42.323925 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.323484 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:42.323925 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.323498 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:42.323925 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.323558 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:44.32354008 +0000 UTC m=+6.268970775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:42.614555 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.613975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:42.614555 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.614112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:42.617412 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.615140 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:42.617412 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:42.615238 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:42.684476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.683285 2577 generic.go:358] "Generic (PLEG): container finished" podID="88424cbdbf8cac342ecdd775f1456f96" containerID="06419aa0daa67a6b2c770f6214e02fadca4a9246cc6be2dab2741a5b9fe56fb9" exitCode=0 Apr 16 18:30:42.684476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:42.684221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal" event={"ID":"88424cbdbf8cac342ecdd775f1456f96","Type":"ContainerDied","Data":"06419aa0daa67a6b2c770f6214e02fadca4a9246cc6be2dab2741a5b9fe56fb9"} Apr 16 18:30:43.706334 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:43.706297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal" event={"ID":"88424cbdbf8cac342ecdd775f1456f96","Type":"ContainerStarted","Data":"f211633064e7937e80bcdd2de3a834af0beafb0a7fc37cd3d79afd63f0df358d"} Apr 16 18:30:43.722693 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:43.722636 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-1.ec2.internal" podStartSLOduration=4.722617704 podStartE2EDuration="4.722617704s" podCreationTimestamp="2026-04-16 18:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:43.722016568 +0000 UTC m=+5.667447277" watchObservedRunningTime="2026-04-16 18:30:43.722617704 +0000 UTC m=+5.668048431" Apr 16 18:30:44.140238 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:44.140196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:44.140416 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.140382 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:44.140472 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.140447 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:48.140429078 +0000 UTC m=+10.085859774 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:44.341784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:44.341727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:44.341970 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.341916 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:44.341970 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.341941 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:44.341970 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.341954 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:44.342145 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.342015 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:48.341996358 +0000 UTC m=+10.287427045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:44.614128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:44.614080 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:44.614308 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.614222 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:44.614389 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:44.614365 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:44.614500 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:44.614477 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:46.614920 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:46.614245 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:46.614920 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:46.614368 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:46.614920 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:46.614781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:46.614920 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:46.614883 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:48.175938 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:48.175904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:48.176368 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.176047 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:48.176368 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.176105 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:56.176088675 +0000 UTC m=+18.121519363 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:48.377358 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:48.377314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:48.377520 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.377481 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:48.377520 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.377500 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:48.377520 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.377513 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:48.377753 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.377576 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:56.377555992 +0000 UTC m=+18.322986685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:48.617518 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:48.617489 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:48.617693 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.617611 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:48.617898 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:48.617865 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:48.618023 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:48.617985 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:50.613921 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:50.613885 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:50.614383 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:50.613904 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:50.614383 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:50.614027 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:50.614383 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:50.614088 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:52.613748 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:52.613716 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:52.614266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:52.613777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:52.614266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:52.613877 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:52.614266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:52.613973 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:54.613599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:54.613563 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:54.614089 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:54.613617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:54.614089 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:54.613704 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:54.614089 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:54.613841 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:56.234865 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:56.234827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:56.235317 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.234996 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:56.235317 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.235063 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.235047068 +0000 UTC m=+34.180477761 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:56.436883 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:56.436845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:56.437038 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.436985 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:56.437038 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.437002 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:56.437038 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.437012 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:56.437146 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.437064 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.437049689 +0000 UTC m=+34.382480378 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:56.613875 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:56.613834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:56.613875 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:56.613863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:56.614109 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.613984 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:56.614202 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:56.614167 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:58.614480 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.614273 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:30:58.615207 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.614337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:30:58.615207 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:58.614568 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:30:58.615207 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:30:58.614638 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:30:58.735349 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.735174 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:30:58.735711 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.735686 2577 generic.go:358] "Generic (PLEG): container finished" podID="b18d081b-3d3f-48e8-8f52-8eb619b60b77" containerID="b95410f8fd761bb372d8b876a977990959b74413748769408450002424de64f3" exitCode=1 Apr 16 18:30:58.735807 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.735762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"0b8ca318f2b5fd9ec8487ba81003a46bce8c6463dc4e926b68dea535ebabaa80"} Apr 16 18:30:58.735807 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.735803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerDied","Data":"b95410f8fd761bb372d8b876a977990959b74413748769408450002424de64f3"} Apr 16 18:30:58.735878 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.735819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"810af9e08280e6db122bfe422df4ea1f546677311bb8878e8996ea6b9cafdd54"} Apr 16 18:30:58.736998 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.736968 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7rdcc" event={"ID":"8f66e95f-32ea-4c62-b967-18110b01aac3","Type":"ContainerStarted","Data":"44d4e26c715a1afd03972d906b6e12a947c93f7f5509259215115b925899c1ba"} Apr 16 18:30:58.738295 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.738275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mw7zk" event={"ID":"85759909-428a-4c11-95e0-96f51d6580f6","Type":"ContainerStarted","Data":"fb7ebadcacf58f6b4485c439f0c96e8c266bb58d225db4654674b1cac67b8491"} Apr 16 18:30:58.741116 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.740154 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="4ad01333edf0fff2854f2df81d58f7a7f2d4aadb0f0218b684ecb79d242c402e" exitCode=0 Apr 16 18:30:58.741116 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.740217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"4ad01333edf0fff2854f2df81d58f7a7f2d4aadb0f0218b684ecb79d242c402e"} Apr 16 18:30:58.743170 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.743129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" event={"ID":"8184d437-e11f-4dc1-a89a-e831d28d24ff","Type":"ContainerStarted","Data":"8bcea4471e80c46cfc7937d909ae5b54044bbd3bbcfcac0407242b7bc07b5029"} Apr 16 18:30:58.744413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.744394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" 
event={"ID":"6e091b04-e5d1-4928-9203-5358e7ad1e2a","Type":"ContainerStarted","Data":"f062288c8accc2880be9d37b4b9ad36abfbf260dd5706ee3fe75b3314067a086"} Apr 16 18:30:58.745550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.745531 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7jl4x" event={"ID":"b86bb118-f0ab-4605-860a-df81a23f9124","Type":"ContainerStarted","Data":"766b3188b7c181302218aacd2e63000d959c1a74b11bdee53c063909a367422b"} Apr 16 18:30:58.746655 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.746637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9fkg9" event={"ID":"c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3","Type":"ContainerStarted","Data":"efd995cbc13239e504bbca34bef5873edbbd26721e8068dcbd8207177b1f7803"} Apr 16 18:30:58.752646 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.752612 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7rdcc" podStartSLOduration=4.063903711 podStartE2EDuration="20.752602267s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.2447163 +0000 UTC m=+3.190146990" lastFinishedPulling="2026-04-16 18:30:57.933414841 +0000 UTC m=+19.878845546" observedRunningTime="2026-04-16 18:30:58.752335839 +0000 UTC m=+20.697766549" watchObservedRunningTime="2026-04-16 18:30:58.752602267 +0000 UTC m=+20.698032977" Apr 16 18:30:58.768804 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.768748 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lgzq4" podStartSLOduration=4.103736895 podStartE2EDuration="20.768737394s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.242063969 +0000 UTC m=+3.187494671" lastFinishedPulling="2026-04-16 18:30:57.907064464 +0000 UTC m=+19.852495170" observedRunningTime="2026-04-16 18:30:58.768515653 +0000 UTC m=+20.713946363" watchObservedRunningTime="2026-04-16 18:30:58.768737394 +0000 UTC m=+20.714168104" Apr 16 18:30:58.808267 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.808228 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9fkg9" podStartSLOduration=4.047417737 podStartE2EDuration="20.808215387s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.234809839 +0000 UTC m=+3.180240527" lastFinishedPulling="2026-04-16 18:30:57.995607472 +0000 UTC m=+19.941038177" observedRunningTime="2026-04-16 18:30:58.787577326 +0000 UTC m=+20.733008048" watchObservedRunningTime="2026-04-16 18:30:58.808215387 +0000 UTC m=+20.753646097" Apr 16 18:30:58.829482 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:58.829445 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mw7zk" podStartSLOduration=4.1622636 podStartE2EDuration="20.829432761s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.240050456 +0000 UTC m=+3.185481145" lastFinishedPulling="2026-04-16 18:30:57.907219615 +0000 UTC m=+19.852650306" observedRunningTime="2026-04-16 18:30:58.808166494 +0000 UTC m=+20.753597214" watchObservedRunningTime="2026-04-16 18:30:58.829432761 +0000 UTC m=+20.774863472" Apr 16 18:30:59.659390 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.659183 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:30:59.751394 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.751364 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:30:59.751779 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.751742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"57dc524081200bd44b123fdf91dc07ea27b66d52f9212ad1fb12841eca37444d"} Apr 16 18:30:59.751893 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.751794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"025a8c65a765ac0b1f6ae13a9cc3dbf063e63a632a39ad467dc02ee33d60298f"} Apr 16 18:30:59.751893 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.751809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"3bbf70e562b29231e4bcddafa42a65d8e0d7963b9eea6cff2f46a4ddfcd97040"} Apr 16 18:30:59.753313 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.753280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mmhj6" event={"ID":"6194e8f3-e97c-49da-8ebb-4764a9a77850","Type":"ContainerStarted","Data":"d729ff720e7cf1718a481322b2ef4a32fc3b72607f2510d3c35888185da4fbfa"} Apr 16 18:30:59.755180 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.755150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" event={"ID":"8184d437-e11f-4dc1-a89a-e831d28d24ff","Type":"ContainerStarted","Data":"4b443c04d4a90659ba22cd82e40233fd4b059fb0686c276541b1f9fbd9d1e59d"} Apr 16 18:30:59.767416 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.767376 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mmhj6" podStartSLOduration=4.070917421 podStartE2EDuration="20.767365009s" podCreationTimestamp="2026-04-16 18:30:39 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.243421864 +0000 UTC m=+3.188852567" lastFinishedPulling="2026-04-16 18:30:57.939869418 +0000 UTC m=+19.885300155" observedRunningTime="2026-04-16 18:30:59.767311121 +0000 UTC m=+21.712741844" watchObservedRunningTime="2026-04-16 18:30:59.767365009 +0000 UTC m=+21.712795718" Apr 16 18:30:59.767689 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:30:59.767660 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7jl4x" podStartSLOduration=5.076694112 podStartE2EDuration="21.767653816s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.242447577 +0000 UTC m=+3.187878270" lastFinishedPulling="2026-04-16 18:30:57.933407279 +0000 UTC m=+19.878837974" observedRunningTime="2026-04-16 18:30:58.846556295 +0000 UTC m=+20.791987005" watchObservedRunningTime="2026-04-16 18:30:59.767653816 +0000 UTC m=+21.713084577" Apr 16 18:31:00.566576 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:00.566467 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:30:59.659383719Z","UUID":"5f00dbd0-211a-4ea0-ab3c-57ce006481bd","Handler":null,"Name":"","Endpoint":""} Apr 16 18:31:00.570007 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:00.569981 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:31:00.570007 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:00.570015 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:31:00.614057 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:00.613928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:00.614057 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:00.613935 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:00.614285 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:00.614064 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:00.614285 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:00.614120 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:01.370324 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:01.370243 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:31:01.761941 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:01.761864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:31:01.762256 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:01.762223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"6e8fea3c7db952fcb92047d1f660806eda6cdc5112aa7f12548d636d82dd7e59"} Apr 16 18:31:01.764096 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:01.764070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" event={"ID":"8184d437-e11f-4dc1-a89a-e831d28d24ff","Type":"ContainerStarted","Data":"06eee602c27935a797f3d314bd1ce83b288fa333f63f430fa38c8b428a547d12"} Apr 16 18:31:01.781364 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:01.781327 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7xmpp" podStartSLOduration=4.161868242 podStartE2EDuration="23.781314649s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.238280711 +0000 UTC m=+3.183711418" lastFinishedPulling="2026-04-16 18:31:00.857727124 +0000 UTC m=+22.803157825" observedRunningTime="2026-04-16 18:31:01.781168144 +0000 UTC m=+23.726598855" watchObservedRunningTime="2026-04-16 18:31:01.781314649 +0000 UTC m=+23.726745361" Apr 16 18:31:02.167239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:02.167205 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:31:02.167954 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:02.167929 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:31:02.613427 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:02.613398 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:02.613427 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:02.613420 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:02.614048 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:02.613514 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:02.614048 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:02.613582 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:02.766893 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:02.766864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mw7zk" Apr 16 18:31:03.770534 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.770354 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:31:03.771147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.770845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"c55cc61aa0d7dbf9d9c5f93cfee0b587d7cc119125c56e275ad6108d7f38f291"} Apr 16 18:31:03.771147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.771121 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:03.771147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.771141 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:03.771298 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.771222 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:03.771448 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.771426 2577 scope.go:117] "RemoveContainer" containerID="b95410f8fd761bb372d8b876a977990959b74413748769408450002424de64f3" Apr 16 18:31:03.772790 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.772740 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="daade5155c673564c5765cf591639e2d822ed2448b009c58c41132bfeb0f281d" exitCode=0 Apr 16 18:31:03.772907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.772797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"daade5155c673564c5765cf591639e2d822ed2448b009c58c41132bfeb0f281d"} Apr 16 18:31:03.787026 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.787008 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:03.787110 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:03.787067 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:04.613979 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.613948 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:04.614149 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.613989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:04.614149 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:04.614082 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:04.614271 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:04.614178 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:04.778821 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.778793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:31:04.779196 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.779164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" event={"ID":"b18d081b-3d3f-48e8-8f52-8eb619b60b77","Type":"ContainerStarted","Data":"ee634e0e1ecc687e0f632e2aaa28f9241c90ab2160e28b1f4c9b91effa0e8259"} Apr 16 18:31:04.780979 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.780957 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="885a2441888a9c8987fb26c551539eeed827c718bf9966dd8759175016d86722" exitCode=0 Apr 16 18:31:04.781068 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.781002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"885a2441888a9c8987fb26c551539eeed827c718bf9966dd8759175016d86722"} Apr 16 18:31:04.854257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:04.854207 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" podStartSLOduration=9.62465092 podStartE2EDuration="26.854192104s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.245001636 +0000 UTC m=+3.190432324" lastFinishedPulling="2026-04-16 18:30:58.474542817 +0000 UTC m=+20.419973508" observedRunningTime="2026-04-16 18:31:04.826312953 +0000 UTC m=+26.771743664" watchObservedRunningTime="2026-04-16 18:31:04.854192104 +0000 UTC m=+26.799622791" Apr 16 18:31:05.250819 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.250789 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tldk9"] Apr 16 18:31:05.250961 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.250901 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:05.251004 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:05.250985 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:05.253835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.253813 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57qhk"] Apr 16 18:31:05.253938 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.253892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:05.253974 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:05.253958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:05.784587 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.784556 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="43bd4ad3a6f3c43fc16a31b18ad7f4bb36a0b6df303b9f375182ef4906b74c6a" exitCode=0 Apr 16 18:31:05.785001 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:05.784629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"43bd4ad3a6f3c43fc16a31b18ad7f4bb36a0b6df303b9f375182ef4906b74c6a"} Apr 16 18:31:06.613739 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:06.613706 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:06.613915 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:06.613752 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:06.613915 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:06.613856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:06.614009 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:06.613972 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:08.614958 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:08.614922 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:08.615641 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:08.615030 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:08.615641 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:08.615080 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:08.615641 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:08.615143 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:10.614059 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.614029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:10.614529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.614029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:10.614529 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:10.614143 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tldk9" podUID="70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1" Apr 16 18:31:10.614529 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:10.614226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57qhk" podUID="47c264de-a221-4aa7-8732-5a2e31ec7974" Apr 16 18:31:10.867801 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.867703 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-1.ec2.internal" event="NodeReady" Apr 16 18:31:10.867958 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.867867 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:31:10.925231 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.925201 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ldpjc"] Apr 16 18:31:10.962398 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.962364 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mstsd"] Apr 16 18:31:10.962548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.962413 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:10.964873 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.964851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4pkb\"" Apr 16 18:31:10.965039 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.964848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:31:10.965143 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.964851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:31:10.985578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.985551 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ldpjc"] Apr 16 18:31:10.985578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.985582 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mstsd"] Apr 16 18:31:10.985777 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.985699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:10.988033 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.988008 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:31:10.988149 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.988043 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:31:10.988149 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.988014 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:31:10.988352 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:10.988316 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2pvvf\"" Apr 16 18:31:11.052746 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.052708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d4ed685-8585-4063-a50d-bab899fa550e-tmp-dir\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.052887 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.052809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.052887 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.052854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d4ed685-8585-4063-a50d-bab899fa550e-config-volume\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.052986 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.052885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qx24\" (UniqueName: 
\"kubernetes.io/projected/2d4ed685-8585-4063-a50d-bab899fa550e-kube-api-access-7qx24\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.153733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d4ed685-8585-4063-a50d-bab899fa550e-config-volume\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.153733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.153733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qx24\" (UniqueName: \"kubernetes.io/projected/2d4ed685-8585-4063-a50d-bab899fa550e-kube-api-access-7qx24\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.153975 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm4lq\" (UniqueName: \"kubernetes.io/projected/5276ac45-8e09-409e-989a-d2ebdd40a1a4-kube-api-access-rm4lq\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.153975 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d4ed685-8585-4063-a50d-bab899fa550e-tmp-dir\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.153975 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.153972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.154103 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.154075 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:11.154159 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.154148 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:11.65412827 +0000 UTC m=+33.599558975 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:11.154299 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.154279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d4ed685-8585-4063-a50d-bab899fa550e-tmp-dir\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.154402 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.154383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d4ed685-8585-4063-a50d-bab899fa550e-config-volume\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.163714 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.163696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qx24\" (UniqueName: \"kubernetes.io/projected/2d4ed685-8585-4063-a50d-bab899fa550e-kube-api-access-7qx24\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.254871 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.254838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.255012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.254898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm4lq\" (UniqueName: \"kubernetes.io/projected/5276ac45-8e09-409e-989a-d2ebdd40a1a4-kube-api-access-rm4lq\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.255012 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.254990 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:11.255091 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.255054 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:11.755035991 +0000 UTC m=+33.700466679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:11.266461 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.266438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm4lq\" (UniqueName: \"kubernetes.io/projected/5276ac45-8e09-409e-989a-d2ebdd40a1a4-kube-api-access-rm4lq\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.657378 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.657314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:11.657850 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.657454 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:11.657850 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.657517 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.657501323 +0000 UTC m=+34.602932015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:11.758685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.758520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:11.758820 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.758662 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:11.758820 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:11.758805 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:12.758790325 +0000 UTC m=+34.704221013 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:11.800490 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:11.800464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerStarted","Data":"9680e022b994f6b033ef47f4c6092ddf45da835bcd70425b24198dffb1b918f2"} Apr 16 18:31:12.261432 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.261342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:12.261620 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.261481 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:12.261620 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.261545 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:44.261530522 +0000 UTC m=+66.206961214 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:31:12.463273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.463239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:12.463458 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.463390 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:31:12.463458 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.463415 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:31:12.463458 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.463426 2577 projected.go:194] Error preparing data for projected volume kube-api-access-76zw2 for pod openshift-network-diagnostics/network-check-target-57qhk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:12.463557 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.463480 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2 podName:47c264de-a221-4aa7-8732-5a2e31ec7974 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:44.463465113 +0000 UTC m=+66.408895801 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-76zw2" (UniqueName: "kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2") pod "network-check-target-57qhk" (UID: "47c264de-a221-4aa7-8732-5a2e31ec7974") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:31:12.613612 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.613583 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:12.613831 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.613583 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:12.616929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.616905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:12.617164 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.617148 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:31:12.617164 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.617155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pk24s\"" Apr 16 18:31:12.617323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.617166 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wddp7\"" Apr 16 18:31:12.617323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.617155 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:31:12.664533 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.664502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:12.664932 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.664601 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:12.664932 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.664648 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:14.664636044 +0000 UTC m=+36.610066732 (durationBeforeRetry 2s). 
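The kube-api-access-76zw2 volume that keeps failing above is the projected volume the kubelet synthesizes for every pod: a bound service-account token, the kube-root-ca.crt configmap (plus, on OpenShift, openshift-service-ca.crt), and the namespace via the downward API. That is why each missing configmap surfaces as its own projected.go error before the combined failure. Its approximate shape in corev1 types is sketched below; the expiration value and the service-ca.crt key name are assumptions for illustration, not read from this log:

// kubeapiaccess.go: the approximate structure of a kube-api-access-* projected
// volume, whose sources correspond one-to-one to the projected.go errors above.
// Illustrative sketch, not the kubelet's literal construction.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // assumed token lifetime, seconds
	vol := corev1.Volume{
		Name: "kube-api-access-76zw2",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path: "token", ExpirationSeconds: &expiry}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}}}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "service-ca.crt", Path: "service-ca.crt"}}}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}}}},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}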
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:12.765740 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.765700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:12.765999 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.765865 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:12.765999 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:12.765945 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:14.765926634 +0000 UTC m=+36.711357341 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:12.805276 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.805243 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="9680e022b994f6b033ef47f4c6092ddf45da835bcd70425b24198dffb1b918f2" exitCode=0 Apr 16 18:31:12.805431 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:12.805292 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"9680e022b994f6b033ef47f4c6092ddf45da835bcd70425b24198dffb1b918f2"} Apr 16 18:31:13.809929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:13.809899 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7ecb9d0-5eb7-46c9-b65f-725014636854" containerID="4f2f4d34f2dfa3911ea849a4931c5a46f8b3011e504625c37c538fd57a189056" exitCode=0 Apr 16 18:31:13.810364 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:13.809974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerDied","Data":"4f2f4d34f2dfa3911ea849a4931c5a46f8b3011e504625c37c538fd57a189056"} Apr 16 18:31:14.679117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:14.679080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:14.679321 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:14.679266 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:14.679390 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:14.679350 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" 
failed. No retries permitted until 2026-04-16 18:31:18.67932556 +0000 UTC m=+40.624756248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:14.779626 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:14.779585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:14.779801 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:14.779758 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:14.779873 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:14.779851 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:18.779834916 +0000 UTC m=+40.725265603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:14.815125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:14.815092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49rzx" event={"ID":"f7ecb9d0-5eb7-46c9-b65f-725014636854","Type":"ContainerStarted","Data":"63ddd3668a8a878a02eeac62b7b204f0b96877ce35a9186f870e75721201e19f"} Apr 16 18:31:14.841522 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:14.841477 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-49rzx" podStartSLOduration=6.421460689 podStartE2EDuration="36.841462101s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:30:41.239156675 +0000 UTC m=+3.184587369" lastFinishedPulling="2026-04-16 18:31:11.659158079 +0000 UTC m=+33.604588781" observedRunningTime="2026-04-16 18:31:14.840022257 +0000 UTC m=+36.785452967" watchObservedRunningTime="2026-04-16 18:31:14.841462101 +0000 UTC m=+36.786892810" Apr 16 18:31:18.706732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:18.706690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:18.707189 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:18.706851 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:18.707189 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:18.706923 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:26.706902371 +0000 UTC m=+48.652333064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:18.807307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:18.807278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:18.807441 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:18.807423 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:18.807498 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:18.807486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:26.807468969 +0000 UTC m=+48.752899657 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:26.760086 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:26.760047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:26.760523 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:26.760214 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:26.760523 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:26.760276 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:42.760261567 +0000 UTC m=+64.705692255 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:26.860958 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:26.860928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:26.861090 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:26.861063 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:26.861128 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:26.861117 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:42.861103287 +0000 UTC m=+64.806533974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:35.795382 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:35.795354 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dsp8f" Apr 16 18:31:42.765418 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:42.765377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:31:42.765912 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:42.765501 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:42.765912 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:42.765562 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:32:14.765548383 +0000 UTC m=+96.710979071 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:31:42.866090 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:42.866062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:31:42.866260 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:42.866194 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:42.866260 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:42.866249 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:14.866235974 +0000 UTC m=+96.811666665 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:31:44.275265 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.275232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:31:44.278128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.278103 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:44.285953 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:44.285935 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:31:44.286018 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:31:44.285993 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs podName:70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:48.285979023 +0000 UTC m=+130.231409711 (durationBeforeRetry 1m4s). 
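
The retry delays in the mount failures above double on each attempt: 4s, 8s, 16s, 32s for the dns-default and ingress-canary volumes (with 1m4s appearing just below), and later 500ms, 1s, 2s, 4s for router-default. This is the kubelet's per-volume exponential backoff on failed MountVolume.SetUp operations. A minimal Go sketch of the doubling pattern follows; the exact constants (500ms start, factor 2, ~2m2s cap) are assumptions inferred from the delays visible in this log, not values copied from kubelet source.

    // Sketch of the doubling retry delay seen in the entries above.
    package main

    import (
        "fmt"
        "time"
    )

    func nextDelay(cur time.Duration) time.Duration {
        // Assumed constants, inferred from the durationBeforeRetry values in the log.
        const (
            initial  = 500 * time.Millisecond
            factor   = 2
            maxDelay = 2*time.Minute + 2*time.Second
        )
        if cur == 0 {
            return initial
        }
        if next := cur * factor; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 10; i++ {
            d = nextDelay(d)
            fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
        }
    }
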
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs") pod "network-metrics-daemon-tldk9" (UID: "70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1") : secret "metrics-daemon-secret" not found Apr 16 18:31:44.476973 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.476923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:44.479616 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.479599 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:31:44.489732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.489699 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:31:44.501382 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.501354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zw2\" (UniqueName: \"kubernetes.io/projected/47c264de-a221-4aa7-8732-5a2e31ec7974-kube-api-access-76zw2\") pod \"network-check-target-57qhk\" (UID: \"47c264de-a221-4aa7-8732-5a2e31ec7974\") " pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:44.731601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.731565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pk24s\"" Apr 16 18:31:44.739094 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.739069 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:44.859351 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.859321 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57qhk"] Apr 16 18:31:44.864527 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:31:44.864489 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c264de_a221_4aa7_8732_5a2e31ec7974.slice/crio-5cc98fe0b2a9227ea7ab428d276bdc54f1dbd45bc6c8fffd208b850140e0cf61 WatchSource:0}: Error finding container 5cc98fe0b2a9227ea7ab428d276bdc54f1dbd45bc6c8fffd208b850140e0cf61: Status 404 returned error can't find the container with id 5cc98fe0b2a9227ea7ab428d276bdc54f1dbd45bc6c8fffd208b850140e0cf61 Apr 16 18:31:44.871556 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:44.871512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57qhk" event={"ID":"47c264de-a221-4aa7-8732-5a2e31ec7974","Type":"ContainerStarted","Data":"5cc98fe0b2a9227ea7ab428d276bdc54f1dbd45bc6c8fffd208b850140e0cf61"} Apr 16 18:31:47.878803 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:47.878751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57qhk" event={"ID":"47c264de-a221-4aa7-8732-5a2e31ec7974","Type":"ContainerStarted","Data":"f5a5e7da441aa67736037d337c4efedecb1ca44781fab7a3dc8b46d158be07d8"} Apr 16 18:31:47.879178 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:47.878902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:31:47.915497 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:31:47.915452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-57qhk" podStartSLOduration=66.226656716 podStartE2EDuration="1m8.915438935s" podCreationTimestamp="2026-04-16 18:30:39 +0000 UTC" firstStartedPulling="2026-04-16 18:31:44.866030591 +0000 UTC m=+66.811461284" lastFinishedPulling="2026-04-16 18:31:47.554812813 +0000 UTC m=+69.500243503" observedRunningTime="2026-04-16 18:31:47.915209445 +0000 UTC m=+69.860640155" watchObservedRunningTime="2026-04-16 18:31:47.915438935 +0000 UTC m=+69.860869644" Apr 16 18:32:14.783248 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:14.783102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc" Apr 16 18:32:14.783839 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:14.783250 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:32:14.783839 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:14.783337 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls podName:2d4ed685-8585-4063-a50d-bab899fa550e nodeName:}" failed. No retries permitted until 2026-04-16 18:33:18.783312754 +0000 UTC m=+160.728743450 (durationBeforeRetry 1m4s). 
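
In the pod_startup_latency_tracker entries above, podStartSLOduration appears to be the end-to-end startup time minus the image-pull window. For network-check-target-57qhk, the pull window from the monotonic offsets is m=+69.500243503 − m=+66.811461284 = 2.688782219s, and 68.915438935s − 2.688782219s = 66.226656716s, exactly the reported podStartSLOduration. The earlier multus-additional-cni-plugins-49rzx entry checks out the same way: 36.841462101s − (33.604588781 − 3.184587369)s = 6.421460689s.
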
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls") pod "dns-default-ldpjc" (UID: "2d4ed685-8585-4063-a50d-bab899fa550e") : secret "dns-default-metrics-tls" not found Apr 16 18:32:14.883814 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:14.883754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd" Apr 16 18:32:14.883979 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:14.883905 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:32:14.883979 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:14.883978 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert podName:5276ac45-8e09-409e-989a-d2ebdd40a1a4 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:18.883961957 +0000 UTC m=+160.829392644 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert") pod "ingress-canary-mstsd" (UID: "5276ac45-8e09-409e-989a-d2ebdd40a1a4") : secret "canary-serving-cert" not found Apr 16 18:32:18.883568 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:18.883538 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-57qhk" Apr 16 18:32:24.830912 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.830876 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-8lld6"] Apr 16 18:32:24.832659 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.832644 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:24.836287 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.836266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-p82dj\"" Apr 16 18:32:24.836419 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.836301 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:32:24.837042 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.837024 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:32:24.837042 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.837029 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:32:24.837213 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.837029 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:32:24.844331 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.844310 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:32:24.847640 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.847618 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-8lld6"] Apr 16 18:32:24.929308 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.929279 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz"] Apr 16 18:32:24.931123 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.931108 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" Apr 16 18:32:24.933507 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.933489 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:32:24.933663 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.933647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7vv5r\"" Apr 16 18:32:24.933921 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.933905 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:32:24.944593 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.944572 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz"] Apr 16 18:32:24.953999 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.953980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5qv\" (UniqueName: \"kubernetes.io/projected/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-kube-api-access-zz5qv\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:24.954084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.954032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-config\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:24.954084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.954062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8j6\" (UniqueName: \"kubernetes.io/projected/aae33711-4f13-4a51-afba-c1684da3b750-kube-api-access-sw8j6\") pod \"volume-data-source-validator-7d955d5dd4-wqxdz\" (UID: \"aae33711-4f13-4a51-afba-c1684da3b750\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" Apr 16 18:32:24.954084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.954081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-trusted-ca\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:24.954182 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:24.954116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-serving-cert\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.055245 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.055215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-config\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.055245 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.055250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8j6\" (UniqueName: \"kubernetes.io/projected/aae33711-4f13-4a51-afba-c1684da3b750-kube-api-access-sw8j6\") pod \"volume-data-source-validator-7d955d5dd4-wqxdz\" (UID: \"aae33711-4f13-4a51-afba-c1684da3b750\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" Apr 16 18:32:25.055505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.055268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-trusted-ca\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.055505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.055310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-serving-cert\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.055505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.055353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5qv\" (UniqueName: \"kubernetes.io/projected/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-kube-api-access-zz5qv\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.056058 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.056032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-config\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.056243 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.056225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-trusted-ca\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.057678 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.057660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-serving-cert\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.073900 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.073875 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5646c9b8dd-rn92g"] Apr 16 18:32:25.075333 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.075314 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sw8j6\" (UniqueName: \"kubernetes.io/projected/aae33711-4f13-4a51-afba-c1684da3b750-kube-api-access-sw8j6\") pod \"volume-data-source-validator-7d955d5dd4-wqxdz\" (UID: \"aae33711-4f13-4a51-afba-c1684da3b750\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" Apr 16 18:32:25.075805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.075791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.079232 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 18:32:25.079579 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079559 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:32:25.079730 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079580 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 18:32:25.079917 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 18:32:25.079917 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p88bt\"" Apr 16 18:32:25.079917 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.079614 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 18:32:25.080112 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.080098 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:32:25.081869 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.081823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5qv\" (UniqueName: \"kubernetes.io/projected/6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897-kube-api-access-zz5qv\") pod \"console-operator-d87b8d5fc-8lld6\" (UID: \"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897\") " pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.111055 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.111033 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5646c9b8dd-rn92g"] Apr 16 18:32:25.146228 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.146197 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:25.156015 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.155994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.156119 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.156063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-default-certificate\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.156182 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.156133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-stats-auth\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.156235 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.156200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5r4n\" (UniqueName: \"kubernetes.io/projected/d2936037-2326-466f-9946-5ddb752141d0-kube-api-access-g5r4n\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.156289 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.156253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.171967 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.171945 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fq9mb"] Apr 16 18:32:25.174119 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.174103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.178064 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.178043 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:32:25.178453 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.178435 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-m2k52\"" Apr 16 18:32:25.178605 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.178592 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:32:25.179276 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.179259 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:32:25.179361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.179346 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:32:25.184234 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.184217 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:32:25.194434 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.194411 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fq9mb"] Apr 16 18:32:25.238997 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.238961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" Apr 16 18:32:25.256943 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.256914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-default-certificate\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.257090 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.256950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257090 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.256985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-stats-auth\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.257220 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257109 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-tmp\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257220 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:32:25.257157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48eda739-7e21-4258-b417-fc943a77343a-serving-cert\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257220 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257183 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9998f\" (UniqueName: \"kubernetes.io/projected/48eda739-7e21-4258-b417-fc943a77343a-kube-api-access-9998f\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257220 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5r4n\" (UniqueName: \"kubernetes.io/projected/d2936037-2326-466f-9946-5ddb752141d0-kube-api-access-g5r4n\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.257413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.257413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.257413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-snapshots\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.257409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.257605 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.257490 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:25.257605 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.257495 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:32:25.757475388 +0000 UTC m=+107.702906079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:25.257605 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.257563 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:25.757543931 +0000 UTC m=+107.702974637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:25.259972 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.259952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-default-certificate\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.260096 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.259982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-stats-auth\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.276397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.276366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5r4n\" (UniqueName: \"kubernetes.io/projected/d2936037-2326-466f-9946-5ddb752141d0-kube-api-access-g5r4n\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.292412 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.292356 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-8lld6"] Apr 16 18:32:25.299440 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:25.299410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d3a1dd4_8bf2_4a6f_8e0f_ee72ea205897.slice/crio-69c820113e1390a53be8bab2398174466fa03d12d59f46a284cc48215837f4cf WatchSource:0}: Error finding container 69c820113e1390a53be8bab2398174466fa03d12d59f46a284cc48215837f4cf: Status 404 returned error can't find the container with id 69c820113e1390a53be8bab2398174466fa03d12d59f46a284cc48215837f4cf Apr 16 18:32:25.358285 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358285 
ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-tmp\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358285 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358273 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48eda739-7e21-4258-b417-fc943a77343a-serving-cert\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358467 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9998f\" (UniqueName: \"kubernetes.io/projected/48eda739-7e21-4258-b417-fc943a77343a-kube-api-access-9998f\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-snapshots\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358588 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.358704 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.358678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-tmp\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.359068 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.359037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.359186 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.359107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48eda739-7e21-4258-b417-fc943a77343a-snapshots\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.359296 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.359280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/48eda739-7e21-4258-b417-fc943a77343a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.360783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.360746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48eda739-7e21-4258-b417-fc943a77343a-serving-cert\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.362645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.362609 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz"] Apr 16 18:32:25.368013 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.367996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9998f\" (UniqueName: \"kubernetes.io/projected/48eda739-7e21-4258-b417-fc943a77343a-kube-api-access-9998f\") pod \"insights-operator-5785d4fcdd-fq9mb\" (UID: \"48eda739-7e21-4258-b417-fc943a77343a\") " pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.483043 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.483015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" Apr 16 18:32:25.593281 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.593249 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-fq9mb"] Apr 16 18:32:25.596093 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:25.596069 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48eda739_7e21_4258_b417_fc943a77343a.slice/crio-3c5502b43c36ab0ef494b2e9ac362b2cc5b91befcae938570f42163e91b176f3 WatchSource:0}: Error finding container 3c5502b43c36ab0ef494b2e9ac362b2cc5b91befcae938570f42163e91b176f3: Status 404 returned error can't find the container with id 3c5502b43c36ab0ef494b2e9ac362b2cc5b91befcae938570f42163e91b176f3 Apr 16 18:32:25.762484 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.762391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.762484 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.762446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:25.762706 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.762567 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:25.762706 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.762588 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle 
podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:26.762565831 +0000 UTC m=+108.707996534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:25.762706 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:25.762628 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:26.762610492 +0000 UTC m=+108.708041184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:25.952420 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.952383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" event={"ID":"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897","Type":"ContainerStarted","Data":"69c820113e1390a53be8bab2398174466fa03d12d59f46a284cc48215837f4cf"} Apr 16 18:32:25.953759 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.953722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" event={"ID":"48eda739-7e21-4258-b417-fc943a77343a","Type":"ContainerStarted","Data":"3c5502b43c36ab0ef494b2e9ac362b2cc5b91befcae938570f42163e91b176f3"} Apr 16 18:32:25.954956 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:25.954923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" event={"ID":"aae33711-4f13-4a51-afba-c1684da3b750","Type":"ContainerStarted","Data":"9940d7bf085f993747f92a8e349dbdeddde9ee466fb1824959600c3f5f5e78c4"} Apr 16 18:32:26.770751 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:26.770694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:26.770937 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:26.770784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:26.770937 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:26.770909 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:28.77088373 +0000 UTC m=+110.716314435 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:26.771072 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:26.770975 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:26.771072 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:26.771030 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:28.771013636 +0000 UTC m=+110.716444325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:27.960848 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.960823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/0.log" Apr 16 18:32:27.961217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.960861 2577 generic.go:358] "Generic (PLEG): container finished" podID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" containerID="6544a37fa31e15c19db236607f04a9e372c6451d0b8ceaef8d76176611e0d161" exitCode=255 Apr 16 18:32:27.961217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.960895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" event={"ID":"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897","Type":"ContainerDied","Data":"6544a37fa31e15c19db236607f04a9e372c6451d0b8ceaef8d76176611e0d161"} Apr 16 18:32:27.961217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.961167 2577 scope.go:117] "RemoveContainer" containerID="6544a37fa31e15c19db236607f04a9e372c6451d0b8ceaef8d76176611e0d161" Apr 16 18:32:27.962193 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.962167 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" event={"ID":"48eda739-7e21-4258-b417-fc943a77343a","Type":"ContainerStarted","Data":"e213fcbc84b8ccb6247e136cde1c90004795502132cfd65742e641fd3498ce81"} Apr 16 18:32:27.963433 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.963415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" event={"ID":"aae33711-4f13-4a51-afba-c1684da3b750","Type":"ContainerStarted","Data":"65488657fb6488926a1313e034bda09318c7c2deebd7605724e182247d38855a"} Apr 16 18:32:27.999441 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:27.999362 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wqxdz" podStartSLOduration=1.595374848 podStartE2EDuration="3.9993476s" podCreationTimestamp="2026-04-16 18:32:24 +0000 UTC" firstStartedPulling="2026-04-16 18:32:25.366556547 +0000 UTC m=+107.311987234" lastFinishedPulling="2026-04-16 18:32:27.770529298 +0000 UTC m=+109.715959986" observedRunningTime="2026-04-16 
18:32:27.998556662 +0000 UTC m=+109.943987387" watchObservedRunningTime="2026-04-16 18:32:27.9993476 +0000 UTC m=+109.944778309" Apr 16 18:32:28.019968 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.019931 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" podStartSLOduration=0.839217617 podStartE2EDuration="3.019917528s" podCreationTimestamp="2026-04-16 18:32:25 +0000 UTC" firstStartedPulling="2026-04-16 18:32:25.597807207 +0000 UTC m=+107.543237895" lastFinishedPulling="2026-04-16 18:32:27.778507103 +0000 UTC m=+109.723937806" observedRunningTime="2026-04-16 18:32:28.019544118 +0000 UTC m=+109.964974828" watchObservedRunningTime="2026-04-16 18:32:28.019917528 +0000 UTC m=+109.965348235" Apr 16 18:32:28.787263 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.787236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:28.787372 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.787279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:28.787409 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:28.787395 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:28.787444 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:28.787409 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.787391937 +0000 UTC m=+114.732822629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:28.787444 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:28.787438 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.78742693 +0000 UTC m=+114.732857617 (durationBeforeRetry 4s). 
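
The router-default failures above mix two distinct causes: metrics-certs fails because the secret router-metrics-certs-default does not exist at all, while service-ca-bundle fails because the ConfigMap exists but does not yet contain the referenced key (in OpenShift, service-ca.crt is typically injected into annotated ConfigMaps by the service CA controller after the object is created). A hypothetical sketch of the key check behind the second error, not kubelet source:

    // Sketch: a configmap volume that maps specific keys fails setup
    // when a referenced key is missing from the ConfigMap's data.
    package main

    import "fmt"

    func checkKeys(data map[string]string, referenced []string) error {
        for _, k := range referenced {
            if _, ok := data[k]; !ok {
                return fmt.Errorf("configmap references non-existent config key: %s", k)
            }
        }
        return nil
    }

    func main() {
        // service-ca-bundle exists but service-ca.crt has not been
        // injected yet (hypothetical contents).
        bundle := map[string]string{}
        fmt.Println(checkKeys(bundle, []string{"service-ca.crt"}))
    }
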
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:28.826463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.826437 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b"] Apr 16 18:32:28.829265 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.829250 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.832943 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.832919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:32:28.832943 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.832938 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:32:28.833120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.833102 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:32:28.833350 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.833337 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:32:28.834124 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.834084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jcwgb\"" Apr 16 18:32:28.841873 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.841852 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b"] Apr 16 18:32:28.887634 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.887599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514a34b-67e6-4daf-a518-91f2a7316066-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.887634 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.887634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514a34b-67e6-4daf-a518-91f2a7316066-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.887814 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.887714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26xb\" (UniqueName: 
\"kubernetes.io/projected/6514a34b-67e6-4daf-a518-91f2a7316066-kube-api-access-s26xb\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.967556 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.967532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/1.log" Apr 16 18:32:28.967952 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.967926 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/0.log" Apr 16 18:32:28.967991 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.967958 2577 generic.go:358] "Generic (PLEG): container finished" podID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" exitCode=255 Apr 16 18:32:28.968081 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.968047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" event={"ID":"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897","Type":"ContainerDied","Data":"580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236"} Apr 16 18:32:28.968139 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.968101 2577 scope.go:117] "RemoveContainer" containerID="6544a37fa31e15c19db236607f04a9e372c6451d0b8ceaef8d76176611e0d161" Apr 16 18:32:28.968323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.968308 2577 scope.go:117] "RemoveContainer" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" Apr 16 18:32:28.968548 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:28.968524 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" Apr 16 18:32:28.988918 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.988892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514a34b-67e6-4daf-a518-91f2a7316066-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.989023 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.988920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514a34b-67e6-4daf-a518-91f2a7316066-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.989023 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.988960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s26xb\" (UniqueName: 
\"kubernetes.io/projected/6514a34b-67e6-4daf-a518-91f2a7316066-kube-api-access-s26xb\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.989476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.989450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514a34b-67e6-4daf-a518-91f2a7316066-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.990902 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.990883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514a34b-67e6-4daf-a518-91f2a7316066-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:28.997292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:28.997272 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26xb\" (UniqueName: \"kubernetes.io/projected/6514a34b-67e6-4daf-a518-91f2a7316066-kube-api-access-s26xb\") pod \"kube-storage-version-migrator-operator-756bb7d76f-2m45b\" (UID: \"6514a34b-67e6-4daf-a518-91f2a7316066\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:29.137625 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:29.137593 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" Apr 16 18:32:29.252137 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:29.252105 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b"] Apr 16 18:32:29.255103 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:29.255073 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6514a34b_67e6_4daf_a518_91f2a7316066.slice/crio-0c94985e5df7858752c2c4edb5cf05ab8c85f60f313474b27120b363be672f3a WatchSource:0}: Error finding container 0c94985e5df7858752c2c4edb5cf05ab8c85f60f313474b27120b363be672f3a: Status 404 returned error can't find the container with id 0c94985e5df7858752c2c4edb5cf05ab8c85f60f313474b27120b363be672f3a Apr 16 18:32:29.971614 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:29.971583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/1.log" Apr 16 18:32:29.972084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:29.971980 2577 scope.go:117] "RemoveContainer" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" Apr 16 18:32:29.972248 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:29.972196 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" Apr 16 18:32:29.972951 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:29.972929 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" event={"ID":"6514a34b-67e6-4daf-a518-91f2a7316066","Type":"ContainerStarted","Data":"0c94985e5df7858752c2c4edb5cf05ab8c85f60f313474b27120b363be672f3a"} Apr 16 18:32:31.819594 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.819559 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97"] Apr 16 18:32:31.822634 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.822611 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:31.825428 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.825405 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:32:31.825538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.825464 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-z5j9r\"" Apr 16 18:32:31.826000 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.825984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:32:31.826054 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.825986 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:32:31.826054 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.826034 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:32:31.841554 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.841530 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97"] Apr 16 18:32:31.913217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.913190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93e4910-3556-4922-b202-bb6fcadfd443-serving-cert\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:31.913362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.913245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93e4910-3556-4922-b202-bb6fcadfd443-config\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:31.913362 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.913294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfc5c\" (UniqueName: \"kubernetes.io/projected/a93e4910-3556-4922-b202-bb6fcadfd443-kube-api-access-lfc5c\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:31.977710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:31.977665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" event={"ID":"6514a34b-67e6-4daf-a518-91f2a7316066","Type":"ContainerStarted","Data":"29b583d0f36c8946abbf035315b9c7d82ba7121ffcc11cc9a1289956a1bd45e5"} Apr 16 18:32:32.009635 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.009595 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" podStartSLOduration=2.097957633 podStartE2EDuration="4.009582305s" podCreationTimestamp="2026-04-16 
18:32:28 +0000 UTC" firstStartedPulling="2026-04-16 18:32:29.256862088 +0000 UTC m=+111.202292776" lastFinishedPulling="2026-04-16 18:32:31.16848676 +0000 UTC m=+113.113917448" observedRunningTime="2026-04-16 18:32:32.007705023 +0000 UTC m=+113.953135751" watchObservedRunningTime="2026-04-16 18:32:32.009582305 +0000 UTC m=+113.955013014" Apr 16 18:32:32.013690 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.013660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93e4910-3556-4922-b202-bb6fcadfd443-serving-cert\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.013820 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.013700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93e4910-3556-4922-b202-bb6fcadfd443-config\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.013940 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.013919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfc5c\" (UniqueName: \"kubernetes.io/projected/a93e4910-3556-4922-b202-bb6fcadfd443-kube-api-access-lfc5c\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.014277 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.014250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93e4910-3556-4922-b202-bb6fcadfd443-config\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.016000 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.015982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93e4910-3556-4922-b202-bb6fcadfd443-serving-cert\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.083133 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.083061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfc5c\" (UniqueName: \"kubernetes.io/projected/a93e4910-3556-4922-b202-bb6fcadfd443-kube-api-access-lfc5c\") pod \"service-ca-operator-69965bb79d-zbl97\" (UID: \"a93e4910-3556-4922-b202-bb6fcadfd443\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.132120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.132090 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" Apr 16 18:32:32.249546 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.249517 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97"] Apr 16 18:32:32.252834 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:32.252807 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda93e4910_3556_4922_b202_bb6fcadfd443.slice/crio-cd3dc43ef3c73251b11be15971323b4565871a2fe161b60a6baf934bca8722b5 WatchSource:0}: Error finding container cd3dc43ef3c73251b11be15971323b4565871a2fe161b60a6baf934bca8722b5: Status 404 returned error can't find the container with id cd3dc43ef3c73251b11be15971323b4565871a2fe161b60a6baf934bca8722b5 Apr 16 18:32:32.563219 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.563142 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7jl4x_b86bb118-f0ab-4605-860a-df81a23f9124/dns-node-resolver/0.log" Apr 16 18:32:32.822543 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.822511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:32.823005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.822559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:32.823005 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:32.822675 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:32.823005 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:32.822710 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:40.8226866 +0000 UTC m=+122.768117304 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:32.823005 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:32.822742 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:40.822729934 +0000 UTC m=+122.768160625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:32.981578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:32.981544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" event={"ID":"a93e4910-3556-4922-b202-bb6fcadfd443","Type":"ContainerStarted","Data":"cd3dc43ef3c73251b11be15971323b4565871a2fe161b60a6baf934bca8722b5"} Apr 16 18:32:33.561025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:33.560995 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7rdcc_8f66e95f-32ea-4c62-b967-18110b01aac3/node-ca/0.log" Apr 16 18:32:33.984693 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:33.984648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" event={"ID":"a93e4910-3556-4922-b202-bb6fcadfd443","Type":"ContainerStarted","Data":"bb2ae1610c75166be92bde3361da2ddd03263ea39fdc7bc8390dd3594fad461c"} Apr 16 18:32:34.008190 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:34.008137 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" podStartSLOduration=1.405491657 podStartE2EDuration="3.008121527s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:32.255107279 +0000 UTC m=+114.200537968" lastFinishedPulling="2026-04-16 18:32:33.857737147 +0000 UTC m=+115.803167838" observedRunningTime="2026-04-16 18:32:34.007832001 +0000 UTC m=+115.953262713" watchObservedRunningTime="2026-04-16 18:32:34.008121527 +0000 UTC m=+115.953552238" Apr 16 18:32:35.147147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:35.147114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:35.147147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:35.147148 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:35.147538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:35.147498 2577 scope.go:117] "RemoveContainer" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" Apr 16 18:32:35.147669 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:35.147652 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" Apr 16 18:32:37.522567 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.522528 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-6n9ln"] Apr 16 18:32:37.525510 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.525488 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.530516 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.530390 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:32:37.530516 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.530448 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:32:37.531122 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.531100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:32:37.531323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.531099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:32:37.531424 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.531124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ssl48\"" Apr 16 18:32:37.541888 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.541868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-6n9ln"] Apr 16 18:32:37.656020 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.655971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a1361d1a-a80c-4255-8b03-7b777717f079-signing-key\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.656020 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.656024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a1361d1a-a80c-4255-8b03-7b777717f079-signing-cabundle\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.656224 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.656095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brw4n\" (UniqueName: \"kubernetes.io/projected/a1361d1a-a80c-4255-8b03-7b777717f079-kube-api-access-brw4n\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.757253 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.757216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a1361d1a-a80c-4255-8b03-7b777717f079-signing-cabundle\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.757394 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.757261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brw4n\" (UniqueName: \"kubernetes.io/projected/a1361d1a-a80c-4255-8b03-7b777717f079-kube-api-access-brw4n\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.757394 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.757336 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a1361d1a-a80c-4255-8b03-7b777717f079-signing-key\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.757925 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.757895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a1361d1a-a80c-4255-8b03-7b777717f079-signing-cabundle\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.759649 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.759632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a1361d1a-a80c-4255-8b03-7b777717f079-signing-key\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.767243 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.767221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brw4n\" (UniqueName: \"kubernetes.io/projected/a1361d1a-a80c-4255-8b03-7b777717f079-kube-api-access-brw4n\") pod \"service-ca-bfc587fb7-6n9ln\" (UID: \"a1361d1a-a80c-4255-8b03-7b777717f079\") " pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.834318 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.834291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" Apr 16 18:32:37.961331 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.961281 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-6n9ln"] Apr 16 18:32:37.964738 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:37.964707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1361d1a_a80c_4255_8b03_7b777717f079.slice/crio-93a97cde6ee27ec104e4ad283654238d4df0b8da832c40f7e6bec62810a55057 WatchSource:0}: Error finding container 93a97cde6ee27ec104e4ad283654238d4df0b8da832c40f7e6bec62810a55057: Status 404 returned error can't find the container with id 93a97cde6ee27ec104e4ad283654238d4df0b8da832c40f7e6bec62810a55057 Apr 16 18:32:37.994514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:37.994490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" event={"ID":"a1361d1a-a80c-4255-8b03-7b777717f079","Type":"ContainerStarted","Data":"93a97cde6ee27ec104e4ad283654238d4df0b8da832c40f7e6bec62810a55057"} Apr 16 18:32:38.998339 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:38.998303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" event={"ID":"a1361d1a-a80c-4255-8b03-7b777717f079","Type":"ContainerStarted","Data":"c02839d42489eb11e6385fcf40b401a4b5460cde1c6ab884df384730dea40923"} Apr 16 18:32:39.016198 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:39.016148 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-6n9ln" podStartSLOduration=2.016132778 podStartE2EDuration="2.016132778s" podCreationTimestamp="2026-04-16 18:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:39.015402877 +0000 UTC m=+120.960833612" watchObservedRunningTime="2026-04-16 18:32:39.016132778 +0000 UTC m=+120.961563545" Apr 16 18:32:40.882423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:40.882383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:40.882423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:40.882425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:40.882913 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:40.882549 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:40.882913 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:40.882561 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:56.882543662 +0000 UTC m=+138.827974353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:40.882913 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:40.882615 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs podName:d2936037-2326-466f-9946-5ddb752141d0 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:56.882598632 +0000 UTC m=+138.828029338 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs") pod "router-default-5646c9b8dd-rn92g" (UID: "d2936037-2326-466f-9946-5ddb752141d0") : secret "router-metrics-certs-default" not found Apr 16 18:32:48.343881 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.343847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:32:48.346078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.346055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1-metrics-certs\") pod \"network-metrics-daemon-tldk9\" (UID: \"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1\") " pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:32:48.616086 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.616011 2577 scope.go:117] "RemoveContainer" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" Apr 16 18:32:48.626482 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.626412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wddp7\"" Apr 16 18:32:48.634829 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.634806 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tldk9" Apr 16 18:32:48.755939 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:48.755910 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tldk9"] Apr 16 18:32:48.789569 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:48.789541 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f7edcb_5c76_4d9d_b5ed_8a4093c9dda1.slice/crio-efa19fd9aa0bbef628afb8c21d013a6800fafc6550b4609b93c68cb97c072107 WatchSource:0}: Error finding container efa19fd9aa0bbef628afb8c21d013a6800fafc6550b4609b93c68cb97c072107: Status 404 returned error can't find the container with id efa19fd9aa0bbef628afb8c21d013a6800fafc6550b4609b93c68cb97c072107 Apr 16 18:32:49.029973 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.029877 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tldk9" event={"ID":"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1","Type":"ContainerStarted","Data":"efa19fd9aa0bbef628afb8c21d013a6800fafc6550b4609b93c68cb97c072107"} Apr 16 18:32:49.031242 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.031222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:32:49.031600 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.031585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/1.log" Apr 16 18:32:49.031661 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.031615 2577 generic.go:358] "Generic (PLEG): container finished" podID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" 
containerID="b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8" exitCode=255 Apr 16 18:32:49.031661 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.031648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" event={"ID":"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897","Type":"ContainerDied","Data":"b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8"} Apr 16 18:32:49.031746 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.031669 2577 scope.go:117] "RemoveContainer" containerID="580462baa069fcef65dfd137eeb3d42987abc03679cd4bc8ed79525e08d9d236" Apr 16 18:32:49.032048 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:49.032030 2577 scope.go:117] "RemoveContainer" containerID="b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8" Apr 16 18:32:49.032241 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:49.032224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" Apr 16 18:32:50.036399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:50.036369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:32:50.038321 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:50.038282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tldk9" event={"ID":"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1","Type":"ContainerStarted","Data":"a021b27614d252580bb4914a8083ea335a779d2bfa39ba907a0517652a78abcb"} Apr 16 18:32:51.042996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:51.042952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tldk9" event={"ID":"70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1","Type":"ContainerStarted","Data":"9ac7ace76f12587ddf07ef86440782fc039585a64ab30df980b01ac0a1e58c82"} Apr 16 18:32:51.060290 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:51.060245 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tldk9" podStartSLOduration=131.988917604 podStartE2EDuration="2m13.060230712s" podCreationTimestamp="2026-04-16 18:30:38 +0000 UTC" firstStartedPulling="2026-04-16 18:32:48.791501823 +0000 UTC m=+130.736932511" lastFinishedPulling="2026-04-16 18:32:49.862814919 +0000 UTC m=+131.808245619" observedRunningTime="2026-04-16 18:32:51.059364347 +0000 UTC m=+133.004795057" watchObservedRunningTime="2026-04-16 18:32:51.060230712 +0000 UTC m=+133.005661448" Apr 16 18:32:55.146569 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:55.146536 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:55.146569 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:55.146576 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" Apr 16 18:32:55.146988 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:55.146913 2577 scope.go:117] "RemoveContainer" 
containerID="b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8" Apr 16 18:32:55.147097 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:32:55.147079 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897" Apr 16 18:32:56.915127 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:56.915073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:56.915127 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:56.915125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:56.916259 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:56.916235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2936037-2326-466f-9946-5ddb752141d0-service-ca-bundle\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:56.917417 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:56.917391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2936037-2326-466f-9946-5ddb752141d0-metrics-certs\") pod \"router-default-5646c9b8dd-rn92g\" (UID: \"d2936037-2326-466f-9946-5ddb752141d0\") " pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:57.193628 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:57.193549 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-p88bt\"" Apr 16 18:32:57.202026 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:57.202006 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:57.330833 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:57.330797 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5646c9b8dd-rn92g"] Apr 16 18:32:57.334027 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:57.333999 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2936037_2326_466f_9946_5ddb752141d0.slice/crio-d4944494bd15866854bcdf762fc06b4680b17b7a14f6f33f1e0a9165558c1bb9 WatchSource:0}: Error finding container d4944494bd15866854bcdf762fc06b4680b17b7a14f6f33f1e0a9165558c1bb9: Status 404 returned error can't find the container with id d4944494bd15866854bcdf762fc06b4680b17b7a14f6f33f1e0a9165558c1bb9 Apr 16 18:32:58.062170 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.062133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" event={"ID":"d2936037-2326-466f-9946-5ddb752141d0","Type":"ContainerStarted","Data":"7869fa114de34f3a54fb6f484bf9ae939b7c7b6401efdd1e1b59282dfe7a0045"} Apr 16 18:32:58.062170 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.062170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" event={"ID":"d2936037-2326-466f-9946-5ddb752141d0","Type":"ContainerStarted","Data":"d4944494bd15866854bcdf762fc06b4680b17b7a14f6f33f1e0a9165558c1bb9"} Apr 16 18:32:58.083397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.083357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" podStartSLOduration=33.083343734 podStartE2EDuration="33.083343734s" podCreationTimestamp="2026-04-16 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:58.082322237 +0000 UTC m=+140.027752948" watchObservedRunningTime="2026-04-16 18:32:58.083343734 +0000 UTC m=+140.028774444" Apr 16 18:32:58.203036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.203007 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:58.205497 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.205474 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5646c9b8dd-rn92g" Apr 16 18:32:58.435987 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.435947 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9s2zk"] Apr 16 18:32:58.439202 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.439185 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.441620 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.441597 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:32:58.441620 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.441600 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ppqlw\"" Apr 16 18:32:58.441834 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.441691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:32:58.451183 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.451160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9s2zk"] Apr 16 18:32:58.525062 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.525036 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84b8f69b7d-8pb5p"] Apr 16 18:32:58.527138 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.527244 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c6f20480-57b7-41d9-b9fb-06c81c82803c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.527244 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c6f20480-57b7-41d9-b9fb-06c81c82803c-data-volume\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.527347 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c6f20480-57b7-41d9-b9fb-06c81c82803c-crio-socket\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.527396 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl2m\" (UniqueName: \"kubernetes.io/projected/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-api-access-wbl2m\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.527946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.527933 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:32:58.533607 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.533585 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:32:58.536376 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.536017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:32:58.536376 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.536274 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jftk\"" Apr 16 18:32:58.539996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.539928 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:32:58.542211 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.542190 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:32:58.553785 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.553742 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84b8f69b7d-8pb5p"] Apr 16 18:32:58.628534 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c6f20480-57b7-41d9-b9fb-06c81c82803c-crio-socket\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.628666 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-installation-pull-secrets\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:32:58.628666 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-bound-sa-token\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:32:58.628666 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl2m\" (UniqueName: \"kubernetes.io/projected/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-api-access-wbl2m\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 18:32:58.628666 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c6f20480-57b7-41d9-b9fb-06c81c82803c-crio-socket\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk" Apr 16 
18:32:58.628853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628677 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-image-registry-private-configuration\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.628853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtf9\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-kube-api-access-bwtf9\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.628853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.628853 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/689117e1-30ad-4535-910e-895627fda928-ca-trust-extracted\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.629041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c6f20480-57b7-41d9-b9fb-06c81c82803c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.629041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c6f20480-57b7-41d9-b9fb-06c81c82803c-data-volume\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.629041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-registry-tls\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.629041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.628976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-trusted-ca\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.629227 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.629047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-registry-certificates\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.629281 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.629248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c6f20480-57b7-41d9-b9fb-06c81c82803c-data-volume\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.629438 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.629418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.631211 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.631187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c6f20480-57b7-41d9-b9fb-06c81c82803c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.642481 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.642461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl2m\" (UniqueName: \"kubernetes.io/projected/c6f20480-57b7-41d9-b9fb-06c81c82803c-kube-api-access-wbl2m\") pod \"insights-runtime-extractor-9s2zk\" (UID: \"c6f20480-57b7-41d9-b9fb-06c81c82803c\") " pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.730279 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-image-registry-private-configuration\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730279 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwtf9\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-kube-api-access-bwtf9\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/689117e1-30ad-4535-910e-895627fda928-ca-trust-extracted\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-registry-tls\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-trusted-ca\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-registry-certificates\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-installation-pull-secrets\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-bound-sa-token\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.730915 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.730878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/689117e1-30ad-4535-910e-895627fda928-ca-trust-extracted\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.731404 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.731379 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-registry-certificates\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.731648 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.731629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/689117e1-30ad-4535-910e-895627fda928-trusted-ca\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.732710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.732685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-image-registry-private-configuration\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.732812 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.732793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/689117e1-30ad-4535-910e-895627fda928-installation-pull-secrets\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.732959 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.732928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-registry-tls\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.741059 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.741035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-bound-sa-token\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.741905 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.741888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwtf9\" (UniqueName: \"kubernetes.io/projected/689117e1-30ad-4535-910e-895627fda928-kube-api-access-bwtf9\") pod \"image-registry-84b8f69b7d-8pb5p\" (UID: \"689117e1-30ad-4535-910e-895627fda928\") " pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.747321 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.747294 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9s2zk"
Apr 16 18:32:58.836507 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.836482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:32:58.874590 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.874563 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9s2zk"]
Apr 16 18:32:58.875787 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:58.875735 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f20480_57b7_41d9_b9fb_06c81c82803c.slice/crio-a2a34c7c90e54396b88c8c2c382c8a3f80b6a0e6a9cc28acee43963d3130acb6 WatchSource:0}: Error finding container a2a34c7c90e54396b88c8c2c382c8a3f80b6a0e6a9cc28acee43963d3130acb6: Status 404 returned error can't find the container with id a2a34c7c90e54396b88c8c2c382c8a3f80b6a0e6a9cc28acee43963d3130acb6
Apr 16 18:32:58.992353 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:58.992255 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84b8f69b7d-8pb5p"]
Apr 16 18:32:58.995901 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:32:58.995874 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689117e1_30ad_4535_910e_895627fda928.slice/crio-aa1426e0fa9932c208c6a9fe0bb6d22ba75ac9763f3fd5f0c69b5d5e3fd6fc87 WatchSource:0}: Error finding container aa1426e0fa9932c208c6a9fe0bb6d22ba75ac9763f3fd5f0c69b5d5e3fd6fc87: Status 404 returned error can't find the container with id aa1426e0fa9932c208c6a9fe0bb6d22ba75ac9763f3fd5f0c69b5d5e3fd6fc87
Apr 16 18:32:59.068518 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:59.068490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" event={"ID":"689117e1-30ad-4535-910e-895627fda928","Type":"ContainerStarted","Data":"aa1426e0fa9932c208c6a9fe0bb6d22ba75ac9763f3fd5f0c69b5d5e3fd6fc87"}
Apr 16 18:32:59.069788 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:59.069757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s2zk" event={"ID":"c6f20480-57b7-41d9-b9fb-06c81c82803c","Type":"ContainerStarted","Data":"9677d27773cc25d7f247ccf705946a40ab4bf07718b53a243c9a3db2a6050fe6"}
Apr 16 18:32:59.069886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:59.069793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s2zk" event={"ID":"c6f20480-57b7-41d9-b9fb-06c81c82803c","Type":"ContainerStarted","Data":"a2a34c7c90e54396b88c8c2c382c8a3f80b6a0e6a9cc28acee43963d3130acb6"}
Apr 16 18:32:59.070041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:59.070025 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5646c9b8dd-rn92g"
Apr 16 18:32:59.071147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:32:59.071130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5646c9b8dd-rn92g"
Apr 16 18:33:00.075046 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:00.075009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" event={"ID":"689117e1-30ad-4535-910e-895627fda928","Type":"ContainerStarted","Data":"a9c3f4bf7d479ffc06fe95cdf80687ef1b5036d22c7c21b9c6e6ef1083829ea4"}
Apr 16 18:33:00.075461 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:00.075108 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p"
Apr 16 18:33:00.076591 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:00.076571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s2zk" event={"ID":"c6f20480-57b7-41d9-b9fb-06c81c82803c","Type":"ContainerStarted","Data":"24bee625a2bb19eb1ef5cc64a76c6b32f55390eaa093d5d1e4aeb23a4273479e"}
Apr 16 18:33:00.099887 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:00.097500 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podStartSLOduration=2.097482419 podStartE2EDuration="2.097482419s" podCreationTimestamp="2026-04-16 18:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:00.096046887 +0000 UTC m=+142.041477600" watchObservedRunningTime="2026-04-16 18:33:00.097482419 +0000 UTC m=+142.042913130"
Apr 16 18:33:01.081015 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:01.080976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9s2zk" event={"ID":"c6f20480-57b7-41d9-b9fb-06c81c82803c","Type":"ContainerStarted","Data":"a22a97300cb8dc1801067d0629ec238ad5579715a56975a4a2440d340bb5b465"}
Apr 16 18:33:01.102125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:01.102067 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9s2zk" podStartSLOduration=1.123664787 podStartE2EDuration="3.102046475s" podCreationTimestamp="2026-04-16 18:32:58 +0000 UTC" firstStartedPulling="2026-04-16 18:32:58.948305268 +0000 UTC m=+140.893735957" lastFinishedPulling="2026-04-16 18:33:00.926686958 +0000 UTC m=+142.872117645" observedRunningTime="2026-04-16 18:33:01.100544808 +0000 UTC m=+143.045975523" watchObservedRunningTime="2026-04-16 18:33:01.102046475 +0000 UTC m=+143.047477186"
Apr 16 18:33:05.613892 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.613862 2577 scope.go:117] "RemoveContainer" containerID="b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8"
Apr 16 18:33:05.614244 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:33:05.614068 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-8lld6_openshift-console-operator(6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podUID="6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897"
Apr 16 18:33:05.886069 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.885988 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6v66t"]
Apr 16 18:33:05.889492 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.889468 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.892123 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.892099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:33:05.892392 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.892373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rxsj5\""
Apr 16 18:33:05.892494 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.892419 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:33:05.892494 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.892372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:33:05.892860 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.892844 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:33:05.893294 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.893276 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:33:05.895554 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:33:05.895738 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-metrics-client-ca\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.895839 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-textfile\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.895839 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.895939 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-root\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.895991 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-sys\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.895991 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.895968 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.896084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.896003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-tls\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.896084 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.896047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ppv\" (UniqueName: \"kubernetes.io/projected/6ebc2f20-5e6d-4dba-aa77-26f703020588-kube-api-access-d5ppv\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.896179 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.896089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-wtmp\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-tls\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ppv\" (UniqueName: \"kubernetes.io/projected/6ebc2f20-5e6d-4dba-aa77-26f703020588-kube-api-access-d5ppv\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-wtmp\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-metrics-client-ca\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996874 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-wtmp\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996874 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-textfile\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.996874 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-root\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-sys\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.996998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-root\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997230 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.997052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebc2f20-5e6d-4dba-aa77-26f703020588-sys\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997230 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.997092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-textfile\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997333 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.997291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-metrics-client-ca\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.997495 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.997473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-accelerators-collector-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.999200 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.999176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:05.999306 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:05.999236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6ebc2f20-5e6d-4dba-aa77-26f703020588-node-exporter-tls\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:06.009535 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.009512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ppv\" (UniqueName: \"kubernetes.io/projected/6ebc2f20-5e6d-4dba-aa77-26f703020588-kube-api-access-d5ppv\") pod \"node-exporter-6v66t\" (UID: \"6ebc2f20-5e6d-4dba-aa77-26f703020588\") " pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:06.199143 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.199062 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6v66t"
Apr 16 18:33:06.206761 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:06.206737 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ebc2f20_5e6d_4dba_aa77_26f703020588.slice/crio-6bc47c5e73e957d449a9ae4136f5d14ab46a5b2e7e367a1a5fcf7069bc1cb183 WatchSource:0}: Error finding container 6bc47c5e73e957d449a9ae4136f5d14ab46a5b2e7e367a1a5fcf7069bc1cb183: Status 404 returned error can't find the container with id 6bc47c5e73e957d449a9ae4136f5d14ab46a5b2e7e367a1a5fcf7069bc1cb183
Apr 16 18:33:06.989314 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.989218 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:33:06.992388 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.992360 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:06.994711 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.994681 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:33:06.994935 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.994737 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:33:06.994935 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.994748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:33:06.994935 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.994788 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:33:06.994935 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.994690 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:33:06.995288 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.995268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:33:06.995408 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.995347 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:33:06.995569 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.995550 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:33:06.995679 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.995602 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:33:06.995736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:06.995684 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-66k7k\""
Apr 16 18:33:07.004231 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004328 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004328 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004429 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004429 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jp9\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.004732 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.004679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.007538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.007518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:33:07.097463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.097429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v66t" event={"ID":"6ebc2f20-5e6d-4dba-aa77-26f703020588","Type":"ContainerStarted","Data":"fdc83514c5d2d19f879901306d77ace2bf02e141514ce85cb99b304dd9523f5d"}
Apr 16 18:33:07.097597 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.097481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v66t" event={"ID":"6ebc2f20-5e6d-4dba-aa77-26f703020588","Type":"ContainerStarted","Data":"6bc47c5e73e957d449a9ae4136f5d14ab46a5b2e7e367a1a5fcf7069bc1cb183"}
Apr 16 18:33:07.109726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.109698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.109985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110272 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jp9\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110355 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110446 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110655 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.110828 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.110809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.111431 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.111407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.111521 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.111451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.111584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.111518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.111641 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:33:07.110473 2577 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 18:33:07.111686 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:33:07.111660 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls podName:69197d1a-adb4-458a-9b05-1e0d33350333 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:07.611638286 +0000 UTC m=+149.557068999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333") : secret "alertmanager-main-tls" not found
Apr 16 18:33:07.112225 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.112196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.112319 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.111524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.112662 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.112636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.113073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.113048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.113332 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.113311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.113886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.113864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.114144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.114119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.114426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.114404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.114506 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.114487 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.115492 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.115470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.115581 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.115563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.119673 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.119650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jp9\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.616836 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.616792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.619320 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.619284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:07.917359 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:07.917260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:33:08.048379 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:08.048355 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:33:08.051272 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:08.051245 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69197d1a_adb4_458a_9b05_1e0d33350333.slice/crio-bc6a773569fcd90a53ad057130449a2fa0011c338a70549094dc4b7f4b4211a4 WatchSource:0}: Error finding container bc6a773569fcd90a53ad057130449a2fa0011c338a70549094dc4b7f4b4211a4: Status 404 returned error can't find the container with id bc6a773569fcd90a53ad057130449a2fa0011c338a70549094dc4b7f4b4211a4
Apr 16 18:33:08.101544 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:08.101515 2577 generic.go:358] "Generic (PLEG): container finished" podID="6ebc2f20-5e6d-4dba-aa77-26f703020588" containerID="fdc83514c5d2d19f879901306d77ace2bf02e141514ce85cb99b304dd9523f5d" exitCode=0
Apr 16 18:33:08.101677 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:08.101590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v66t" event={"ID":"6ebc2f20-5e6d-4dba-aa77-26f703020588","Type":"ContainerDied","Data":"fdc83514c5d2d19f879901306d77ace2bf02e141514ce85cb99b304dd9523f5d"}
Apr 16 18:33:08.102550 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:08.102525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"bc6a773569fcd90a53ad057130449a2fa0011c338a70549094dc4b7f4b4211a4"}
Apr 16 18:33:09.106752 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.106671 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v66t" event={"ID":"6ebc2f20-5e6d-4dba-aa77-26f703020588","Type":"ContainerStarted","Data":"c18cb56b2c29039bf3d27b3f5780ab7b9faeabcdb6dae77d4dcee96ab45eae52"}
Apr 16 18:33:09.106752 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.106718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6v66t" event={"ID":"6ebc2f20-5e6d-4dba-aa77-26f703020588","Type":"ContainerStarted","Data":"4fd686cf1b33abfefe81ce1ab7868a7269d05a8b3d65c8c5dab465eecb743ab7"}
Apr 16 18:33:09.108126 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.108105 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8" exitCode=0
Apr 16 18:33:09.108229 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.108155 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8"}
Apr 16 18:33:09.162795 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.162719 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6v66t" podStartSLOduration=3.41388689 podStartE2EDuration="4.162703976s" podCreationTimestamp="2026-04-16 18:33:05 +0000 UTC" firstStartedPulling="2026-04-16 18:33:06.208417115 +0000 UTC m=+148.153847818" lastFinishedPulling="2026-04-16 18:33:06.957234213 +0000 UTC m=+148.902664904" observedRunningTime="2026-04-16 18:33:09.128017833 +0000 UTC m=+151.073448544" watchObservedRunningTime="2026-04-16 18:33:09.162703976 +0000 UTC m=+151.108134687"
Apr 16 18:33:09.885419 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.885376 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6457777b56-twkd8"]
Apr 16 18:33:09.888323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.888302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.890811 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.890791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 18:33:09.890981 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.890955 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 18:33:09.891083 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.890962 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 18:33:09.891083 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.891004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 18:33:09.891083 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.891030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wbnqb\""
Apr 16 18:33:09.891228 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.891170 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 18:33:09.891492 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.891475 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8htco6s3jjqf4\""
Apr 16 18:33:09.899127 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.899106 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6457777b56-twkd8"]
Apr 16 18:33:09.941887 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.941854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.941915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.941949 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.941976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-grpc-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.942009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942200 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.942133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942200 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.942177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkz2t\" (UniqueName: \"kubernetes.io/projected/c63c44d1-e7f6-4c84-a570-20f255e769fd-kube-api-access-mkz2t\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:09.942279 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:09.942229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63c44d1-e7f6-4c84-a570-20f255e769fd-metrics-client-ca\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.042876 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.042830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.042935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.042974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkz2t\" (UniqueName: \"kubernetes.io/projected/c63c44d1-e7f6-4c84-a570-20f255e769fd-kube-api-access-mkz2t\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.043014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63c44d1-e7f6-4c84-a570-20f255e769fd-metrics-client-ca\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.043033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.043060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043339 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.043079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.043339 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.043095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-grpc-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.044168 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.044139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c63c44d1-e7f6-4c84-a570-20f255e769fd-metrics-client-ca\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.046599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.046574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-grpc-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:10.047097 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.047053 2577
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-tls\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.047301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.047278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.047374 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.047276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.047606 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.047583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.047675 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.047655 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c63c44d1-e7f6-4c84-a570-20f255e769fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.052181 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.052161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkz2t\" (UniqueName: \"kubernetes.io/projected/c63c44d1-e7f6-4c84-a570-20f255e769fd-kube-api-access-mkz2t\") pod \"thanos-querier-6457777b56-twkd8\" (UID: \"c63c44d1-e7f6-4c84-a570-20f255e769fd\") " pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" Apr 16 18:33:10.199461 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.199369 2577 util.go:30] "No sandbox for pod can be found. 
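
The records above trace the kubelet volume manager bringing the thanos-querier pod's volumes up in three phases, as the file:line tags show: VerifyControllerAttachedVolume (reconciler_common.go:251), MountVolume started (reconciler_common.go:224), and MountVolume.SetUp succeeded (operation_generator.go:615). A minimal sketch of that ordering follows; the types and phase functions are illustrative assumptions, not the kubelet's real API, and only the phase names and UniqueName shape mirror the log.

// Minimal sketch of the three-phase volume sequence traced above.
// Illustrative only: types here are assumptions, not kubelet code.
package main

import "fmt"

type volume struct {
	name   string // e.g. "secret-thanos-querier-tls"
	plugin string // e.g. "kubernetes.io/secret"
	podUID string
}

func main() {
	vols := []volume{
		{"metrics-client-ca", "kubernetes.io/configmap", "c63c44d1-e7f6-4c84-a570-20f255e769fd"},
		{"secret-thanos-querier-tls", "kubernetes.io/secret", "c63c44d1-e7f6-4c84-a570-20f255e769fd"},
		{"kube-api-access-mkz2t", "kubernetes.io/projected", "c63c44d1-e7f6-4c84-a570-20f255e769fd"},
	}
	// Phase 1: confirm each volume counts as attached for this node.
	for _, v := range vols {
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q\n", v.name)
	}
	// Phase 2: hand each volume to the operation executor for mounting.
	for _, v := range vols {
		fmt.Printf("MountVolume started for volume %q\n", v.name)
	}
	// Phase 3: per-plugin SetUp materializes files under the pod directory.
	for _, v := range vols {
		fmt.Printf("MountVolume.SetUp succeeded for volume %q (UniqueName: %s/%s-%s)\n",
			v.name, v.plugin, v.podUID, v.name)
	}
}
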
Apr 16 18:33:10.348476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.348454 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6457777b56-twkd8"]
Apr 16 18:33:10.352865 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:10.352840 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63c44d1_e7f6_4c84_a570_20f255e769fd.slice/crio-574c3ef0c00dea02e687eb752a6d31cd30da7b35fd0666244717dcd3da33c90b WatchSource:0}: Error finding container 574c3ef0c00dea02e687eb752a6d31cd30da7b35fd0666244717dcd3da33c90b: Status 404 returned error can't find the container with id 574c3ef0c00dea02e687eb752a6d31cd30da7b35fd0666244717dcd3da33c90b
Apr 16 18:33:10.379892 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.379869 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5466b5fd47-bm2b5"]
Apr 16 18:33:10.382125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.382108 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.384629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384555 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:33:10.384629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jfhsk\""
Apr 16 18:33:10.384629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-537u00sn8dq49\""
Apr 16 18:33:10.384629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 18:33:10.384863 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 18:33:10.384863 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.384570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 18:33:10.392939 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.392921 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5466b5fd47-bm2b5"]
Apr 16 18:33:10.446741 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.446718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-metrics-server-audit-profiles\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.446830 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.446755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.446830 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.446802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-client-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.446902 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.446828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dcc3d5f1-1012-45bf-9057-4522e50a3623-audit-log\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.447020 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.446987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-tls\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.447077 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.447056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-client-certs\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.447167 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.447086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4mtw\" (UniqueName: \"kubernetes.io/projected/dcc3d5f1-1012-45bf-9057-4522e50a3623-kube-api-access-q4mtw\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548727 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-client-certs\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548848 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4mtw\" (UniqueName: \"kubernetes.io/projected/dcc3d5f1-1012-45bf-9057-4522e50a3623-kube-api-access-q4mtw\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548848 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-metrics-server-audit-profiles\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548963 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548963 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-client-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.548963 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.548921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dcc3d5f1-1012-45bf-9057-4522e50a3623-audit-log\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.549103 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.549033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-tls\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.549422 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.549376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/dcc3d5f1-1012-45bf-9057-4522e50a3623-audit-log\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.549912 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.549884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.550975 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.550935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/dcc3d5f1-1012-45bf-9057-4522e50a3623-metrics-server-audit-profiles\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.551564 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.551508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-client-certs\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.551692 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.551674 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-client-ca-bundle\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.552012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.551996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/dcc3d5f1-1012-45bf-9057-4522e50a3623-secret-metrics-server-tls\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.556889 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.556867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4mtw\" (UniqueName: \"kubernetes.io/projected/dcc3d5f1-1012-45bf-9057-4522e50a3623-kube-api-access-q4mtw\") pod \"metrics-server-5466b5fd47-bm2b5\" (UID: \"dcc3d5f1-1012-45bf-9057-4522e50a3623\") " pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.692568 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.692536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5"
Apr 16 18:33:10.814351 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:10.814325 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5466b5fd47-bm2b5"]
Apr 16 18:33:10.816581 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:10.816552 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc3d5f1_1012_45bf_9057_4522e50a3623.slice/crio-ef6208f8c8c984a3c55777d15f95b13fcd8eff6cf12a6068472b7ecde2455653 WatchSource:0}: Error finding container ef6208f8c8c984a3c55777d15f95b13fcd8eff6cf12a6068472b7ecde2455653: Status 404 returned error can't find the container with id ef6208f8c8c984a3c55777d15f95b13fcd8eff6cf12a6068472b7ecde2455653
Apr 16 18:33:11.119643 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.119605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389"}
Apr 16 18:33:11.119858 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.119652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9"}
Apr 16 18:33:11.119858 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.119664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558"}
Apr 16 18:33:11.119858 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.119674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7"}
Apr 16 18:33:11.119858 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.119681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b"}
Apr 16 18:33:11.120922 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.120890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"574c3ef0c00dea02e687eb752a6d31cd30da7b35fd0666244717dcd3da33c90b"}
Apr 16 18:33:11.121997 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:11.121970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" event={"ID":"dcc3d5f1-1012-45bf-9057-4522e50a3623","Type":"ContainerStarted","Data":"ef6208f8c8c984a3c55777d15f95b13fcd8eff6cf12a6068472b7ecde2455653"}
Apr 16 18:33:12.128000 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:12.127965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerStarted","Data":"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5"}
Apr 16 18:33:12.159520 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:12.159467 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.887024769 podStartE2EDuration="6.159451519s" podCreationTimestamp="2026-04-16 18:33:06 +0000 UTC" firstStartedPulling="2026-04-16 18:33:08.053120697 +0000 UTC m=+149.998551385" lastFinishedPulling="2026-04-16 18:33:11.325547424 +0000 UTC m=+153.270978135" observedRunningTime="2026-04-16 18:33:12.158435485 +0000 UTC m=+154.103866233" watchObservedRunningTime="2026-04-16 18:33:12.159451519 +0000 UTC m=+154.104882228"
Apr 16 18:33:13.133358 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.133326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" event={"ID":"dcc3d5f1-1012-45bf-9057-4522e50a3623","Type":"ContainerStarted","Data":"c72a60dae5fb1939fa73e20ae500723d805e30d563f4c95e1f9e7ec70be3362c"}
Apr 16 18:33:13.135919 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"6e5b6a2b9a2c5e6ee6bfcc782c88000c7c335221a67bd0eca14a98cd7d0bbee2"}
Apr 16 18:33:13.135919 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"6e7660f579b1007678a28fbaa5d445465ea86ed304011dff2f0b015663937e4b"}
Apr 16 18:33:13.136078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135940 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"861eef90e6377eee1e75d18eedb3d12928f24127f6b039e10c0f8651a5e0bc08"}
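
The pod_startup_latency_tracker record above carries two durations. podStartE2EDuration is exactly watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling); that decomposition is inferred from the numbers in these records, not taken from kubelet source. A sketch reproducing the alertmanager-main-0 values:

// Recomputes the two durations in the alertmanager-main-0 record above.
// The SLO = E2E minus image-pull-window relationship is an inference
// from this log's numbers, not a statement about kubelet internals.
package main

import (
	"fmt"
	"time"
)

// ts parses the wall-clock timestamps as they appear in the log
// (Go's time.Parse accepts fractional seconds even when the layout
// omits them).
func ts(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := ts("2026-04-16 18:33:06 +0000 UTC")
	firstPull := ts("2026-04-16 18:33:08.053120697 +0000 UTC")
	lastPull := ts("2026-04-16 18:33:11.325547424 +0000 UTC")
	observed := ts("2026-04-16 18:33:12.159451519 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)         // logged: podStartE2EDuration="6.159451519s"
	slo := e2e - lastPull.Sub(firstPull) // logged: podStartSLOduration=2.887024769
	// Prints 6.159451519s 2.887024792s; the ~23ns residue against the
	// logged SLO value presumably comes from monotonic-clock readings
	// (the m=+... offsets) versus these wall-clock fields.
	fmt.Println(e2e, slo)
}
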
Apr 16 18:33:13.136078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"46874040e852040c7f9725ad900dc5dc348f6964d52b8c01624d9462cf4ff06b"}
Apr 16 18:33:13.136078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"d9b9acd417282d0dcd0a8e99aa175374cf48ec76ac5d8959946dc666f32f5cb1"}
Apr 16 18:33:13.136078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.135978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" event={"ID":"c63c44d1-e7f6-4c84-a570-20f255e769fd","Type":"ContainerStarted","Data":"a0a9ee20d441535905536766f4bc8e6f5c02779573579526c9b3f2bc424fd743"}
Apr 16 18:33:13.153334 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.153288 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" podStartSLOduration=1.432076556 podStartE2EDuration="3.153273514s" podCreationTimestamp="2026-04-16 18:33:10 +0000 UTC" firstStartedPulling="2026-04-16 18:33:10.818632249 +0000 UTC m=+152.764062937" lastFinishedPulling="2026-04-16 18:33:12.539829202 +0000 UTC m=+154.485259895" observedRunningTime="2026-04-16 18:33:13.151023627 +0000 UTC m=+155.096454350" watchObservedRunningTime="2026-04-16 18:33:13.153273514 +0000 UTC m=+155.098704224"
Apr 16 18:33:13.175059 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:13.175012 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8" podStartSLOduration=1.990253255 podStartE2EDuration="4.174998585s" podCreationTimestamp="2026-04-16 18:33:09 +0000 UTC" firstStartedPulling="2026-04-16 18:33:10.354798239 +0000 UTC m=+152.300228947" lastFinishedPulling="2026-04-16 18:33:12.539543588 +0000 UTC m=+154.484974277" observedRunningTime="2026-04-16 18:33:13.173680794 +0000 UTC m=+155.119111504" watchObservedRunningTime="2026-04-16 18:33:13.174998585 +0000 UTC m=+155.120429295"
Apr 16 18:33:13.973844 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:33:13.973794 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ldpjc" podUID="2d4ed685-8585-4063-a50d-bab899fa550e"
Apr 16 18:33:13.995956 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:33:13.995929 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mstsd" podUID="5276ac45-8e09-409e-989a-d2ebdd40a1a4"
Apr 16 18:33:14.139536 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:14.139505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ldpjc"
Apr 16 18:33:14.140001 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:14.139724 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:18.618143 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.617841 2577 scope.go:117] "RemoveContainer" containerID="b2ae6af12871c4ac9996e53aa34fb42dfcd6f94da46e926c627c4220c5f50bd8"
Apr 16 18:33:18.826132 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.826098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc"
Apr 16 18:33:18.828426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.828395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d4ed685-8585-4063-a50d-bab899fa550e-metrics-tls\") pod \"dns-default-ldpjc\" (UID: \"2d4ed685-8585-4063-a50d-bab899fa550e\") " pod="openshift-dns/dns-default-ldpjc"
Apr 16 18:33:18.841029 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.840992 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:33:18.841114 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.841055 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:33:18.927579 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.927501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd"
Apr 16 18:33:18.929862 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.929824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5276ac45-8e09-409e-989a-d2ebdd40a1a4-cert\") pod \"ingress-canary-mstsd\" (UID: \"5276ac45-8e09-409e-989a-d2ebdd40a1a4\") " pod="openshift-ingress-canary/ingress-canary-mstsd"
Apr 16 18:33:18.943176 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.943154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4pkb\""
Apr 16 18:33:18.950898 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:18.950880 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ldpjc"
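
The image-registry probe failures above include the registry's error payload in start-of-body. A small decoder for that exact payload follows; the struct shape is fitted to this one body (it matches the docker/distribution-style error envelope, though the type and field names here are ours).

// Decodes the start-of-body payload from the image-registry probe
// failures above. Struct and field names are illustrative assumptions
// shaped to fit this payload.
package main

import (
	"encoding/json"
	"fmt"
)

type registryErrors struct {
	Errors []struct {
		Code    string `json:"code"`
		Message string `json:"message"`
		Detail  string `json:"detail"`
	} `json:"errors"`
}

func main() {
	body := `{"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}`
	var re registryErrors
	if err := json.Unmarshal([]byte(body), &re); err != nil {
		panic(err)
	}
	for _, e := range re.Errors {
		// Prints: UNAVAILABLE: service unavailable (health check failed: please see /debug/health)
		fmt.Printf("%s: %s (%s)\n", e.Code, e.Message, e.Detail)
	}
}
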
Apr 16 18:33:19.071407 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.071375 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ldpjc"]
Apr 16 18:33:19.074596 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:19.074569 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4ed685_8585_4063_a50d_bab899fa550e.slice/crio-7534d410c531267ac13509c2fda13a74f0470837f00acbad2dc8f7d1ea959ccd WatchSource:0}: Error finding container 7534d410c531267ac13509c2fda13a74f0470837f00acbad2dc8f7d1ea959ccd: Status 404 returned error can't find the container with id 7534d410c531267ac13509c2fda13a74f0470837f00acbad2dc8f7d1ea959ccd
Apr 16 18:33:19.157177 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.157144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ldpjc" event={"ID":"2d4ed685-8585-4063-a50d-bab899fa550e","Type":"ContainerStarted","Data":"7534d410c531267ac13509c2fda13a74f0470837f00acbad2dc8f7d1ea959ccd"}
Apr 16 18:33:19.158776 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.158742 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log"
Apr 16 18:33:19.158879 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.158831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" event={"ID":"6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897","Type":"ContainerStarted","Data":"50aeb6d952c10b082fc9101f4b3cd87a6475da40714bf316a59970ce1cd35edf"}
Apr 16 18:33:19.159138 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.159114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6"
Apr 16 18:33:19.177740 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.177420 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6" podStartSLOduration=52.704174578 podStartE2EDuration="55.177404159s" podCreationTimestamp="2026-04-16 18:32:24 +0000 UTC" firstStartedPulling="2026-04-16 18:32:25.301279163 +0000 UTC m=+107.246709851" lastFinishedPulling="2026-04-16 18:32:27.774508737 +0000 UTC m=+109.719939432" observedRunningTime="2026-04-16 18:33:19.176806235 +0000 UTC m=+161.122236944" watchObservedRunningTime="2026-04-16 18:33:19.177404159 +0000 UTC m=+161.122834864"
Apr 16 18:33:19.753486 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.753457 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-8lld6"
Apr 16 18:33:19.950118 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.950087 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-fj9jn"]
Apr 16 18:33:19.953623 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.953600 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-fj9jn"
Apr 16 18:33:19.957053 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.956932 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-krvz9\""
Apr 16 18:33:19.957153 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.957083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:33:19.957696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.957672 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:33:19.969691 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:19.969654 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-fj9jn"]
Apr 16 18:33:20.037908 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.037815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgtm\" (UniqueName: \"kubernetes.io/projected/7a21faa0-d8d7-438e-a753-354f55344b61-kube-api-access-smgtm\") pod \"downloads-586b57c7b4-fj9jn\" (UID: \"7a21faa0-d8d7-438e-a753-354f55344b61\") " pod="openshift-console/downloads-586b57c7b4-fj9jn"
Apr 16 18:33:20.138723 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.138693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smgtm\" (UniqueName: \"kubernetes.io/projected/7a21faa0-d8d7-438e-a753-354f55344b61-kube-api-access-smgtm\") pod \"downloads-586b57c7b4-fj9jn\" (UID: \"7a21faa0-d8d7-438e-a753-354f55344b61\") " pod="openshift-console/downloads-586b57c7b4-fj9jn"
Apr 16 18:33:20.147874 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.147801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgtm\" (UniqueName: \"kubernetes.io/projected/7a21faa0-d8d7-438e-a753-354f55344b61-kube-api-access-smgtm\") pod \"downloads-586b57c7b4-fj9jn\" (UID: \"7a21faa0-d8d7-438e-a753-354f55344b61\") " pod="openshift-console/downloads-586b57c7b4-fj9jn"
Apr 16 18:33:20.150248 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.150224 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6457777b56-twkd8"
Apr 16 18:33:20.264676 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.264645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-fj9jn"
Apr 16 18:33:20.405105 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:20.405082 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-fj9jn"]
Apr 16 18:33:20.407901 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:20.407874 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a21faa0_d8d7_438e_a753_354f55344b61.slice/crio-e8336571713134e0eff625e16689a59bd886bc61b6ea94d76766bee4e1dc528b WatchSource:0}: Error finding container e8336571713134e0eff625e16689a59bd886bc61b6ea94d76766bee4e1dc528b: Status 404 returned error can't find the container with id e8336571713134e0eff625e16689a59bd886bc61b6ea94d76766bee4e1dc528b
Apr 16 18:33:21.087075 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.086857 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:33:21.087075 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.086932 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:33:21.168294 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.168255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ldpjc" event={"ID":"2d4ed685-8585-4063-a50d-bab899fa550e","Type":"ContainerStarted","Data":"50bd60e676ca27d77ac6377e629818fa2c2c0d5cce6f4aca1e3f16bf541caa59"}
Apr 16 18:33:21.168294 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.168297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ldpjc" event={"ID":"2d4ed685-8585-4063-a50d-bab899fa550e","Type":"ContainerStarted","Data":"81eb399a7eb0a6cf6829c29144aaac5a023582852c58e1b600f7a362660d3956"}
Apr 16 18:33:21.168527 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.168402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ldpjc"
Apr 16 18:33:21.169564 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.169522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-fj9jn" event={"ID":"7a21faa0-d8d7-438e-a753-354f55344b61","Type":"ContainerStarted","Data":"e8336571713134e0eff625e16689a59bd886bc61b6ea94d76766bee4e1dc528b"}
Apr 16 18:33:21.188635 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:21.188585 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ldpjc" podStartSLOduration=129.987753894 podStartE2EDuration="2m11.188568438s" podCreationTimestamp="2026-04-16 18:31:10 +0000 UTC" firstStartedPulling="2026-04-16 18:33:19.076642467 +0000 UTC m=+161.022073158" lastFinishedPulling="2026-04-16 18:33:20.277456997 +0000 UTC m=+162.222887702" observedRunningTime="2026-04-16 18:33:21.187212493 +0000 UTC m=+163.132643206" watchObservedRunningTime="2026-04-16 18:33:21.188568438 +0000 UTC m=+163.133999149"
Apr 16 18:33:25.613986 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:25.613952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mstsd"
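
The manager.go:1169 "Failed to process watch event ... Status 404" warning repeats for every pod started in this window, and in each case the same container ID appears in a ContainerStarted PLEG event moments later (574c3ef0..., ef6208f8..., 7534d410..., e8336571..., and so on). It reads as a benign startup race between the cgroup watch and the runtime registering the new container; that interpretation is ours, not stated by the log. A hypothetical triage helper for correlating these warnings, written against the exact line shape seen here:

// Hypothetical helper: extract pod UID and container ID from the
// "Failed to process watch event" warnings above, so they can be
// matched against later ContainerStarted events. The regexp targets
// only the cgroup path shape in this log.
package main

import (
	"fmt"
	"regexp"
)

var watchEvent = regexp.MustCompile(
	`kubepods-burstable-pod([0-9a-f_]+)\.slice/crio-([0-9a-f]+)`)

func main() {
	line := `Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a21faa0_d8d7_438e_a753_354f55344b61.slice/crio-e8336571713134e0eff625e16689a59bd886bc61b6ea94d76766bee4e1dc528b WatchSource:0}`
	if m := watchEvent.FindStringSubmatch(line); m != nil {
		// Prints: podUID=7a21faa0_d8d7_438e_a753_354f55344b61 containerID=e83365...
		fmt.Printf("podUID=%s containerID=%s\n", m[1], m[2])
	}
}
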
Apr 16 18:33:25.616648 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:25.616625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2pvvf\""
Apr 16 18:33:25.624814 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:25.624793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mstsd"
Apr 16 18:33:25.758861 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:25.758834 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mstsd"]
Apr 16 18:33:25.761945 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:25.761909 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5276ac45_8e09_409e_989a_d2ebdd40a1a4.slice/crio-a69fcd660da11224853c45e320a1f67c5d4d6aedf0d895d1e90756367161d0f7 WatchSource:0}: Error finding container a69fcd660da11224853c45e320a1f67c5d4d6aedf0d895d1e90756367161d0f7: Status 404 returned error can't find the container with id a69fcd660da11224853c45e320a1f67c5d4d6aedf0d895d1e90756367161d0f7
Apr 16 18:33:26.193094 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:26.193057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mstsd" event={"ID":"5276ac45-8e09-409e-989a-d2ebdd40a1a4","Type":"ContainerStarted","Data":"a69fcd660da11224853c45e320a1f67c5d4d6aedf0d895d1e90756367161d0f7"}
Apr 16 18:33:28.202210 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:28.202175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mstsd" event={"ID":"5276ac45-8e09-409e-989a-d2ebdd40a1a4","Type":"ContainerStarted","Data":"c6f9c4bbc83a12599e4d542105beee1f2c8421fd3b7bcb2ad715ffb23cc8879d"}
Apr 16 18:33:28.223239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:28.223189 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mstsd" podStartSLOduration=136.619105734 podStartE2EDuration="2m18.223172711s" podCreationTimestamp="2026-04-16 18:31:10 +0000 UTC" firstStartedPulling="2026-04-16 18:33:25.764231926 +0000 UTC m=+167.709662634" lastFinishedPulling="2026-04-16 18:33:27.368298922 +0000 UTC m=+169.313729611" observedRunningTime="2026-04-16 18:33:28.221567083 +0000 UTC m=+170.166997816" watchObservedRunningTime="2026-04-16 18:33:28.223172711 +0000 UTC m=+170.168603454"
Apr 16 18:33:28.840749 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:28.840716 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:33:28.840927 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:28.840788 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:33:29.078144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.078112 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66b8666969-kdwvs"]
Apr 16 18:33:29.082523 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.082500 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.085309 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.085282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:33:29.085452 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.085424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:33:29.085586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.085546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:33:29.085655 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.085628 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:33:29.085735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.085718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:33:29.086053 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.086035 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jczkq\""
Apr 16 18:33:29.096488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.096434 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b8666969-kdwvs"]
Apr 16 18:33:29.227263 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.227696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.227696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.227696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkhqm\" (UniqueName: \"kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.227696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.227696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.227450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.327987 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.327952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkhqm\" (UniqueName: \"kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328739 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.328944 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.328914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.329162 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.329139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.330948 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.330927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.331149 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.331131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.341177 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.341153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkhqm\" (UniqueName: \"kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm\") pod \"console-66b8666969-kdwvs\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " pod="openshift-console/console-66b8666969-kdwvs"
Apr 16 18:33:29.393919 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:29.393827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66b8666969-kdwvs"
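
Every mount record above keys its volume by a UniqueName of the form kubernetes.io/<plugin>/<podUID>-<volumeName>, with secret, configmap, projected, and empty-dir plugins all visible in this log. A hypothetical splitter for that key follows, assuming only the shape seen here (a standard 36-character pod UID, so the UID/volume split is unambiguous):

// Hypothetical helper splitting the UniqueName keys used in the mount
// records above. Assumes the kubernetes.io/<plugin>/<uid>-<volume>
// shape seen in this log with a 36-character UUID.
package main

import (
	"fmt"
	"strings"
)

func splitUniqueName(u string) (plugin, podUID, volume string, ok bool) {
	rest, found := strings.CutPrefix(u, "kubernetes.io/")
	if !found {
		return "", "", "", false
	}
	plugin, rest, found = strings.Cut(rest, "/")
	if !found || len(rest) < 38 { // 36-char UUID + "-" + at least one char
		return "", "", "", false
	}
	return "kubernetes.io/" + plugin, rest[:36], rest[37:], true
}

func main() {
	u := "kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert"
	plugin, uid, vol, ok := splitUniqueName(u)
	// Prints: kubernetes.io/secret 330da926-d1e5-49b9-b00b-8c71db5b4276 console-serving-cert true
	fmt.Println(plugin, uid, vol, ok)
}
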
Need to start a new one" pod="openshift-console/console-66b8666969-kdwvs" Apr 16 18:33:30.693538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:30.693494 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" Apr 16 18:33:30.694028 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:30.693576 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" Apr 16 18:33:31.085858 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:31.085804 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:33:31.086046 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:31.085876 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:31.176102 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:31.176068 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ldpjc" Apr 16 18:33:35.862722 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:35.862690 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66b8666969-kdwvs"] Apr 16 18:33:35.867895 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:35.867860 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330da926_d1e5_49b9_b00b_8c71db5b4276.slice/crio-6380e9d9dc8ed8ff57277a1cfdc1d3b806f9dd9063283bd6cb98d08f0494868f WatchSource:0}: Error finding container 6380e9d9dc8ed8ff57277a1cfdc1d3b806f9dd9063283bd6cb98d08f0494868f: Status 404 returned error can't find the container with id 6380e9d9dc8ed8ff57277a1cfdc1d3b806f9dd9063283bd6cb98d08f0494868f Apr 16 18:33:36.232004 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:36.231957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b8666969-kdwvs" event={"ID":"330da926-d1e5-49b9-b00b-8c71db5b4276","Type":"ContainerStarted","Data":"6380e9d9dc8ed8ff57277a1cfdc1d3b806f9dd9063283bd6cb98d08f0494868f"} Apr 16 18:33:36.233583 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:36.233548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-fj9jn" event={"ID":"7a21faa0-d8d7-438e-a753-354f55344b61","Type":"ContainerStarted","Data":"a19d3849f9fcb40c6e4124c553c41636763d7d5e3a530de03d01ba451c1e1f22"} Apr 16 18:33:36.233820 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:36.233803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-fj9jn" Apr 16 18:33:36.253917 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:36.253863 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-fj9jn" podStartSLOduration=1.839715859 podStartE2EDuration="17.253848654s" podCreationTimestamp="2026-04-16 18:33:19 +0000 UTC" firstStartedPulling="2026-04-16 18:33:20.409716761 +0000 UTC m=+162.355147452" lastFinishedPulling="2026-04-16 18:33:35.823849539 +0000 UTC 
m=+177.769280247" observedRunningTime="2026-04-16 18:33:36.252278991 +0000 UTC m=+178.197709700" watchObservedRunningTime="2026-04-16 18:33:36.253848654 +0000 UTC m=+178.199279360" Apr 16 18:33:36.254391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:36.254364 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-fj9jn" Apr 16 18:33:38.242799 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.241293 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:33:38.279072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.279013 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:33:38.279236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.279185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.287923 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.287894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:33:38.414735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.414946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.414946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.414946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86bb\" (UniqueName: \"kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.414946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.415172 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.414953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.415172 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.415013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.515870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.515870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.515870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.515870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.515870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.516199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.516199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.515929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m86bb\" (UniqueName: \"kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.519397 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:33:38.519327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.519397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.519333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.519397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.519362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.519905 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.519861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.520138 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.520101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.527032 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.527013 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:33:38.527617 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.527592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86bb\" (UniqueName: \"kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.528467 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.528424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle\") pod \"console-df8555c5b-spxfk\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.593548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.593497 2577 util.go:30] "No sandbox for pod can be found. 
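
The pod_startup_latency_tracker entry a few records back for openshift-console/downloads-586b57c7b4-fj9jn ties its fields together with simple arithmetic: podStartSLOduration is podStartE2EDuration minus the image-pull window, computed from the monotonic m=+ offsets rather than the wall-clock timestamps. A small sketch that reproduces the reported value from the offsets in that entry (the numbers are copied from the log; the script itself is only illustrative):

first_started_pulling = 162.355147452  # m=+ offset at firstStartedPulling
last_finished_pulling = 177.769280247  # m=+ offset at lastFinishedPulling
pod_start_e2e = 17.253848654           # podStartE2EDuration, in seconds

# Time the pod spent pulling images.
pull_window = last_finished_pulling - first_started_pulling

# The SLO figure excludes image pulls from the end-to-end duration.
pod_start_slo = pod_start_e2e - pull_window

print(f"{pod_start_slo:.9f}")  # 1.839715859 -> matches podStartSLOduration

The same relation holds for the two console pods further down in this log (podStartSLOduration=7.641736817 and 1.617154007).
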
Need to start a new one" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:38.842733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.842681 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:33:38.842928 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.842782 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:38.842928 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.842839 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:33:38.843434 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.843387 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"a9c3f4bf7d479ffc06fe95cdf80687ef1b5036d22c7c21b9c6e6ef1083829ea4"} pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" containerMessage="Container registry failed liveness probe, will be restarted" Apr 16 18:33:38.848484 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.848437 2577 patch_prober.go:28] interesting pod/image-registry-84b8f69b7d-8pb5p container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:33:38.848603 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:38.848489 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:39.162057 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:39.162012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:33:39.165735 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:33:39.165705 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e726afe_85e3_44c2_b794_d40597ecc578.slice/crio-c1d788d6c80ce76d0de10ae3fc4889cae70539cc8e64be2c40c466e4145418b7 WatchSource:0}: Error finding container c1d788d6c80ce76d0de10ae3fc4889cae70539cc8e64be2c40c466e4145418b7: Status 404 returned error can't find the container with id c1d788d6c80ce76d0de10ae3fc4889cae70539cc8e64be2c40c466e4145418b7 Apr 16 18:33:39.247316 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:39.247223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df8555c5b-spxfk" event={"ID":"7e726afe-85e3-44c2-b794-d40597ecc578","Type":"ContainerStarted","Data":"c1d788d6c80ce76d0de10ae3fc4889cae70539cc8e64be2c40c466e4145418b7"} Apr 16 18:33:40.252958 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:40.252860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b8666969-kdwvs" 
event={"ID":"330da926-d1e5-49b9-b00b-8c71db5b4276","Type":"ContainerStarted","Data":"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850"} Apr 16 18:33:40.254628 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:40.254597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df8555c5b-spxfk" event={"ID":"7e726afe-85e3-44c2-b794-d40597ecc578","Type":"ContainerStarted","Data":"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034"} Apr 16 18:33:40.271377 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:40.271320 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66b8666969-kdwvs" podStartSLOduration=7.641736817 podStartE2EDuration="11.271306277s" podCreationTimestamp="2026-04-16 18:33:29 +0000 UTC" firstStartedPulling="2026-04-16 18:33:35.870033334 +0000 UTC m=+177.815464026" lastFinishedPulling="2026-04-16 18:33:39.499602786 +0000 UTC m=+181.445033486" observedRunningTime="2026-04-16 18:33:40.27031757 +0000 UTC m=+182.215748316" watchObservedRunningTime="2026-04-16 18:33:40.271306277 +0000 UTC m=+182.216736998" Apr 16 18:33:40.291071 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:40.291020 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-df8555c5b-spxfk" podStartSLOduration=1.617154007 podStartE2EDuration="2.291003736s" podCreationTimestamp="2026-04-16 18:33:38 +0000 UTC" firstStartedPulling="2026-04-16 18:33:39.168325688 +0000 UTC m=+181.113756377" lastFinishedPulling="2026-04-16 18:33:39.842175415 +0000 UTC m=+181.787606106" observedRunningTime="2026-04-16 18:33:40.288543504 +0000 UTC m=+182.233974215" watchObservedRunningTime="2026-04-16 18:33:40.291003736 +0000 UTC m=+182.236434449" Apr 16 18:33:48.594129 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:48.594100 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:48.594904 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:48.594141 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:48.600216 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:48.600196 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:48.847228 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:48.847154 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:33:49.290044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:49.290018 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:33:49.365236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:49.365206 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66b8666969-kdwvs"] Apr 16 18:33:49.394480 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:49.394445 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66b8666969-kdwvs" Apr 16 18:33:50.291173 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:50.291142 2577 generic.go:358] "Generic (PLEG): container finished" podID="a93e4910-3556-4922-b202-bb6fcadfd443" containerID="bb2ae1610c75166be92bde3361da2ddd03263ea39fdc7bc8390dd3594fad461c" exitCode=0 Apr 16 18:33:50.291530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:50.291211 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" event={"ID":"a93e4910-3556-4922-b202-bb6fcadfd443","Type":"ContainerDied","Data":"bb2ae1610c75166be92bde3361da2ddd03263ea39fdc7bc8390dd3594fad461c"} Apr 16 18:33:50.291530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:50.291513 2577 scope.go:117] "RemoveContainer" containerID="bb2ae1610c75166be92bde3361da2ddd03263ea39fdc7bc8390dd3594fad461c" Apr 16 18:33:50.698818 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:50.698790 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" Apr 16 18:33:50.702668 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:50.702643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5466b5fd47-bm2b5" Apr 16 18:33:51.296028 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:33:51.295993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zbl97" event={"ID":"a93e4910-3556-4922-b202-bb6fcadfd443","Type":"ContainerStarted","Data":"45b193d5b3b8187e227579df372fa8c31a7440cabfaaf3de14d92ce6c542a57b"} Apr 16 18:34:02.327661 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:02.327628 2577 generic.go:358] "Generic (PLEG): container finished" podID="6514a34b-67e6-4daf-a518-91f2a7316066" containerID="29b583d0f36c8946abbf035315b9c7d82ba7121ffcc11cc9a1289956a1bd45e5" exitCode=0 Apr 16 18:34:02.328151 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:02.327671 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" event={"ID":"6514a34b-67e6-4daf-a518-91f2a7316066","Type":"ContainerDied","Data":"29b583d0f36c8946abbf035315b9c7d82ba7121ffcc11cc9a1289956a1bd45e5"} Apr 16 18:34:02.328151 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:02.327959 2577 scope.go:117] "RemoveContainer" containerID="29b583d0f36c8946abbf035315b9c7d82ba7121ffcc11cc9a1289956a1bd45e5" Apr 16 18:34:03.331740 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:03.331702 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-2m45b" event={"ID":"6514a34b-67e6-4daf-a518-91f2a7316066","Type":"ContainerStarted","Data":"03c496aa1c02e53c7ee4890373ba0c45ee8099df44f6a996d21f66cd218a05ad"} Apr 16 18:34:03.866869 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:03.866826 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" podUID="689117e1-30ad-4535-910e-895627fda928" containerName="registry" containerID="cri-o://a9c3f4bf7d479ffc06fe95cdf80687ef1b5036d22c7c21b9c6e6ef1083829ea4" gracePeriod=30 Apr 16 18:34:04.335710 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:04.335674 2577 generic.go:358] "Generic (PLEG): container finished" podID="48eda739-7e21-4258-b417-fc943a77343a" containerID="e213fcbc84b8ccb6247e136cde1c90004795502132cfd65742e641fd3498ce81" exitCode=0 Apr 16 18:34:04.336081 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:04.335716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" event={"ID":"48eda739-7e21-4258-b417-fc943a77343a","Type":"ContainerDied","Data":"e213fcbc84b8ccb6247e136cde1c90004795502132cfd65742e641fd3498ce81"} Apr 16 18:34:04.336081 
ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:04.336052 2577 scope.go:117] "RemoveContainer" containerID="e213fcbc84b8ccb6247e136cde1c90004795502132cfd65742e641fd3498ce81" Apr 16 18:34:05.342783 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.342723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-fq9mb" event={"ID":"48eda739-7e21-4258-b417-fc943a77343a","Type":"ContainerStarted","Data":"7b995d06def8de9e9f5d0d7ac09f01404a1272d7af148b9d12ff6500ce6e5ec2"} Apr 16 18:34:05.345716 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.345679 2577 generic.go:358] "Generic (PLEG): container finished" podID="689117e1-30ad-4535-910e-895627fda928" containerID="a9c3f4bf7d479ffc06fe95cdf80687ef1b5036d22c7c21b9c6e6ef1083829ea4" exitCode=0 Apr 16 18:34:05.345924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.345828 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" event={"ID":"689117e1-30ad-4535-910e-895627fda928","Type":"ContainerDied","Data":"a9c3f4bf7d479ffc06fe95cdf80687ef1b5036d22c7c21b9c6e6ef1083829ea4"} Apr 16 18:34:05.345924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.345864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" event={"ID":"689117e1-30ad-4535-910e-895627fda928","Type":"ContainerStarted","Data":"34eab5cbbfd25439e941cd2d640d17f214613d18182b7cef8f6460144e941040"} Apr 16 18:34:05.346427 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.346402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:34:05.495608 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.495574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ldpjc_2d4ed685-8585-4063-a50d-bab899fa550e/dns/0.log" Apr 16 18:34:05.511106 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.511081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ldpjc_2d4ed685-8585-4063-a50d-bab899fa550e/kube-rbac-proxy/0.log" Apr 16 18:34:05.549489 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:05.549465 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7jl4x_b86bb118-f0ab-4605-860a-df81a23f9124/dns-node-resolver/0.log" Apr 16 18:34:14.384391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.384329 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66b8666969-kdwvs" podUID="330da926-d1e5-49b9-b00b-8c71db5b4276" containerName="console" containerID="cri-o://62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850" gracePeriod=15 Apr 16 18:34:14.666020 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.665994 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66b8666969-kdwvs_330da926-d1e5-49b9-b00b-8c71db5b4276/console/0.log" Apr 16 18:34:14.666125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.666057 2577 util.go:48] "No ready sandbox for pod can be found. 
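
The image-registry sequence above is the standard liveness-failure path: the HTTP probe sees a 503, kubelet logs "Container registry failed liveness probe, will be restarted", stops the container with its grace period (gracePeriod=30), and the PLEG then reports ContainerDied followed by ContainerStarted with a new container ID. Kubelet counts an HTTP probe as successful only for status codes 200-399, and only kills the container after failureThreshold consecutive liveness failures (3 by default). A rough sketch of those semantics; the URL and threshold here are placeholders for the sketch, not values taken from this log:

import urllib.request
import urllib.error

def http_probe(url: str, timeout: float = 1.0) -> bool:
    """Return True iff the endpoint answers with status 200-399."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except urllib.error.HTTPError as err:
        return 200 <= err.code < 400   # e.g. the 503 above -> False
    except (urllib.error.URLError, OSError):
        return False

def should_restart(results: list[bool], failure_threshold: int = 3) -> bool:
    """Restart once the last `failure_threshold` probes all failed."""
    return len(results) >= failure_threshold and not any(results[-failure_threshold:])

probes = [http_probe("http://127.0.0.1:5000/healthz") for _ in range(3)]
print(should_restart(probes))
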
Need to start a new one" pod="openshift-console/console-66b8666969-kdwvs" Apr 16 18:34:14.729289 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729259 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkhqm\" (UniqueName: \"kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729316 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729344 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729388 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729462 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729678 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729506 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert\") pod \"330da926-d1e5-49b9-b00b-8c71db5b4276\" (UID: \"330da926-d1e5-49b9-b00b-8c71db5b4276\") " Apr 16 18:34:14.729942 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729875 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config" (OuterVolumeSpecName: "console-config") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:14.730060 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.729983 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca" (OuterVolumeSpecName: "service-ca") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:14.730172 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.730143 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:14.731699 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.731665 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:14.731699 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.731691 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm" (OuterVolumeSpecName: "kube-api-access-jkhqm") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "kube-api-access-jkhqm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:14.731886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.731742 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "330da926-d1e5-49b9-b00b-8c71db5b4276" (UID: "330da926-d1e5-49b9-b00b-8c71db5b4276"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:14.830322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830288 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-oauth-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:14.830322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830317 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkhqm\" (UniqueName: \"kubernetes.io/projected/330da926-d1e5-49b9-b00b-8c71db5b4276-kube-api-access-jkhqm\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:14.830322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830328 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-console-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:14.830540 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830337 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-oauth-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:14.830540 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830346 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/330da926-d1e5-49b9-b00b-8c71db5b4276-console-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:14.830540 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:14.830355 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/330da926-d1e5-49b9-b00b-8c71db5b4276-service-ca\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:15.376073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376045 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66b8666969-kdwvs_330da926-d1e5-49b9-b00b-8c71db5b4276/console/0.log" Apr 16 18:34:15.376239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376083 2577 generic.go:358] "Generic (PLEG): container finished" podID="330da926-d1e5-49b9-b00b-8c71db5b4276" containerID="62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850" exitCode=2 Apr 16 18:34:15.376239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b8666969-kdwvs" event={"ID":"330da926-d1e5-49b9-b00b-8c71db5b4276","Type":"ContainerDied","Data":"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850"} Apr 16 18:34:15.376239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376151 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66b8666969-kdwvs" Apr 16 18:34:15.376239 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376163 2577 scope.go:117] "RemoveContainer" containerID="62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850" Apr 16 18:34:15.376398 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.376152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66b8666969-kdwvs" event={"ID":"330da926-d1e5-49b9-b00b-8c71db5b4276","Type":"ContainerDied","Data":"6380e9d9dc8ed8ff57277a1cfdc1d3b806f9dd9063283bd6cb98d08f0494868f"} Apr 16 18:34:15.384736 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.384717 2577 scope.go:117] "RemoveContainer" containerID="62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850" Apr 16 18:34:15.385034 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:15.385014 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850\": container with ID starting with 62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850 not found: ID does not exist" containerID="62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850" Apr 16 18:34:15.385078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.385042 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850"} err="failed to get container status \"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850\": rpc error: code = NotFound desc = could not find container \"62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850\": container with ID starting with 62f01e72cb75f57a06eaad467b12e8df6a8a5e8a7535a3bd7ef650f65618f850 not found: ID does not exist" Apr 16 18:34:15.397784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.397736 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66b8666969-kdwvs"] Apr 16 18:34:15.401671 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:15.401650 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66b8666969-kdwvs"] Apr 16 18:34:16.619158 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:16.619120 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330da926-d1e5-49b9-b00b-8c71db5b4276" path="/var/lib/kubelet/pods/330da926-d1e5-49b9-b00b-8c71db5b4276/volumes" Apr 16 18:34:26.373302 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373270 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:26.373759 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373714 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="alertmanager" containerID="cri-o://7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" gracePeriod=120 Apr 16 18:34:26.373836 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373802 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-metric" containerID="cri-o://9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" gracePeriod=120 Apr 16 18:34:26.373903 ip-10-0-140-1 kubenswrapper[2577]: I0416 
18:34:26.373813 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy" containerID="cri-o://ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" gracePeriod=120 Apr 16 18:34:26.373903 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373815 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-web" containerID="cri-o://41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" gracePeriod=120 Apr 16 18:34:26.373903 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373830 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="config-reloader" containerID="cri-o://0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" gracePeriod=120 Apr 16 18:34:26.374058 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:26.373932 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="prom-label-proxy" containerID="cri-o://50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" gracePeriod=120 Apr 16 18:34:27.356667 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.356639 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84b8f69b7d-8pb5p" Apr 16 18:34:27.420253 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420214 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" exitCode=0 Apr 16 18:34:27.420253 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420238 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" exitCode=0 Apr 16 18:34:27.420253 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420258 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" exitCode=0 Apr 16 18:34:27.420697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420266 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" exitCode=0 Apr 16 18:34:27.420697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5"} Apr 16 18:34:27.420697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420330 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9"} Apr 16 18:34:27.420697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420345 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7"} Apr 16 18:34:27.420697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.420359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b"} Apr 16 18:34:27.617984 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.617932 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:27.738109 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.737641 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738373 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738340 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738502 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738489 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738611 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738598 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738709 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738698 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738841 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738827 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.738960 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.738948 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739075 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739063 2577 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739168 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739153 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739287 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739276 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739393 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739381 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739516 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739504 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.739606 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.739595 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8jp9\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9\") pod \"69197d1a-adb4-458a-9b05-1e0d33350333\" (UID: \"69197d1a-adb4-458a-9b05-1e0d33350333\") " Apr 16 18:34:27.740836 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.740805 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:27.742208 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.741532 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:27.742208 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.741905 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:27.743227 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.742931 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.745273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.745242 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.745507 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.745476 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.746092 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.745737 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out" (OuterVolumeSpecName: "config-out") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:27.746230 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.746201 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume" (OuterVolumeSpecName: "config-volume") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.746834 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.746542 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:27.746962 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.746934 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.747242 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.747223 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9" (OuterVolumeSpecName: "kube-api-access-x8jp9") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "kube-api-access-x8jp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:27.752030 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.752006 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.757848 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.757829 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config" (OuterVolumeSpecName: "web-config") pod "69197d1a-adb4-458a-9b05-1e0d33350333" (UID: "69197d1a-adb4-458a-9b05-1e0d33350333"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:27.841384 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841355 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-main-tls\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841384 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841380 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8jp9\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-kube-api-access-x8jp9\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841391 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841402 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841412 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69197d1a-adb4-458a-9b05-1e0d33350333-metrics-client-ca\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841421 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-alertmanager-main-db\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841429 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-config-volume\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841438 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-cluster-tls-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841447 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69197d1a-adb4-458a-9b05-1e0d33350333-tls-assets\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841456 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841465 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841474 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69197d1a-adb4-458a-9b05-1e0d33350333-config-out\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:27.841530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:27.841482 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69197d1a-adb4-458a-9b05-1e0d33350333-web-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:34:28.425321 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425287 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" exitCode=0 Apr 16 18:34:28.425321 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425313 2577 generic.go:358] "Generic (PLEG): container finished" podID="69197d1a-adb4-458a-9b05-1e0d33350333" containerID="41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" exitCode=0 Apr 16 18:34:28.425747 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389"} Apr 16 18:34:28.425747 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558"} Apr 16 18:34:28.425747 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"69197d1a-adb4-458a-9b05-1e0d33350333","Type":"ContainerDied","Data":"bc6a773569fcd90a53ad057130449a2fa0011c338a70549094dc4b7f4b4211a4"} Apr 16 18:34:28.425747 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425412 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.425747 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.425424 2577 scope.go:117] "RemoveContainer" containerID="50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" Apr 16 18:34:28.433423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.433401 2577 scope.go:117] "RemoveContainer" containerID="9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" Apr 16 18:34:28.440465 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.440447 2577 scope.go:117] "RemoveContainer" containerID="ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" Apr 16 18:34:28.447426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.447399 2577 scope.go:117] "RemoveContainer" containerID="41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" Apr 16 18:34:28.449484 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.449445 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:28.454657 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.454634 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:28.461017 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.461000 2577 scope.go:117] "RemoveContainer" containerID="0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" Apr 16 18:34:28.467604 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.467588 2577 scope.go:117] "RemoveContainer" containerID="7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" Apr 16 18:34:28.474209 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.474193 2577 scope.go:117] "RemoveContainer" containerID="15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8" Apr 16 18:34:28.480893 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.480878 2577 scope.go:117] "RemoveContainer" containerID="50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" Apr 16 18:34:28.481150 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.481128 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5\": container with ID starting with 50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5 not found: ID does not exist" containerID="50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" Apr 16 18:34:28.481199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481160 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5"} err="failed to get container status \"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5\": rpc error: code = NotFound desc = could not find container \"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5\": container with ID starting with 50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5 not found: ID does not exist" Apr 16 18:34:28.481199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481180 2577 scope.go:117] "RemoveContainer" containerID="9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" Apr 16 18:34:28.481397 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.481382 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389\": container with ID starting with 9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389 not found: ID does not exist" containerID="9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" Apr 16 18:34:28.481435 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481401 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389"} err="failed to get container status \"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389\": rpc error: code = NotFound desc = could not find container \"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389\": container with ID starting with 9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389 not found: ID does not exist" Apr 16 18:34:28.481435 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481414 2577 scope.go:117] "RemoveContainer" containerID="ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" Apr 16 18:34:28.481616 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.481597 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9\": container with ID starting with ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9 not found: ID does not exist" containerID="ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" Apr 16 18:34:28.481670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481624 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9"} err="failed to get container status \"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9\": rpc error: code = NotFound desc = could not find container \"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9\": container with ID starting with ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9 not found: ID does not exist" Apr 16 18:34:28.481670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.481642 2577 scope.go:117] "RemoveContainer" containerID="41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" Apr 16 18:34:28.482099 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.482070 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558\": container with ID starting with 41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558 not found: ID does not exist" containerID="41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" Apr 16 18:34:28.482237 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482100 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558"} err="failed to get container status \"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558\": rpc error: code = NotFound desc = could not find container \"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558\": container with ID starting with 41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558 not found: ID does not exist" Apr 16 18:34:28.482237 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482121 2577 
scope.go:117] "RemoveContainer" containerID="0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" Apr 16 18:34:28.482459 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.482435 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7\": container with ID starting with 0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7 not found: ID does not exist" containerID="0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" Apr 16 18:34:28.482543 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482467 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7"} err="failed to get container status \"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7\": rpc error: code = NotFound desc = could not find container \"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7\": container with ID starting with 0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7 not found: ID does not exist" Apr 16 18:34:28.482543 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482487 2577 scope.go:117] "RemoveContainer" containerID="7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" Apr 16 18:34:28.482806 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.482783 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b\": container with ID starting with 7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b not found: ID does not exist" containerID="7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" Apr 16 18:34:28.482886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482812 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b"} err="failed to get container status \"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b\": rpc error: code = NotFound desc = could not find container \"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b\": container with ID starting with 7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b not found: ID does not exist" Apr 16 18:34:28.482886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.482832 2577 scope.go:117] "RemoveContainer" containerID="15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8" Apr 16 18:34:28.483087 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:34:28.483068 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8\": container with ID starting with 15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8 not found: ID does not exist" containerID="15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8" Apr 16 18:34:28.483159 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483094 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8"} err="failed to get container status \"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8\": rpc error: code = NotFound desc 
= could not find container \"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8\": container with ID starting with 15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8 not found: ID does not exist" Apr 16 18:34:28.483159 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483113 2577 scope.go:117] "RemoveContainer" containerID="50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5" Apr 16 18:34:28.483399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483376 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5"} err="failed to get container status \"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5\": rpc error: code = NotFound desc = could not find container \"50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5\": container with ID starting with 50b7ffe9dc80a737132a35f2788e6d7d9f1c93ce101296667d5289caa42741b5 not found: ID does not exist" Apr 16 18:34:28.483448 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483402 2577 scope.go:117] "RemoveContainer" containerID="9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389" Apr 16 18:34:28.483645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483619 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389"} err="failed to get container status \"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389\": rpc error: code = NotFound desc = could not find container \"9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389\": container with ID starting with 9f8df2ad5625edb8d4861c2e3aa78125c7ce9d30693a18554fefecd501c05389 not found: ID does not exist" Apr 16 18:34:28.483645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483640 2577 scope.go:117] "RemoveContainer" containerID="ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9" Apr 16 18:34:28.483950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483929 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9"} err="failed to get container status \"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9\": rpc error: code = NotFound desc = could not find container \"ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9\": container with ID starting with ed7a13665d105ce365957b55c2908e4573bf7c8171a561f3d5e422b451fd57a9 not found: ID does not exist" Apr 16 18:34:28.483950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.483948 2577 scope.go:117] "RemoveContainer" containerID="41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558" Apr 16 18:34:28.484098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484075 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:28.484179 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484158 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558"} err="failed to get container status \"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558\": rpc error: code = NotFound desc = could not find container \"41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558\": container with ID starting with 
41c0037287a00c8380a79581dd0733ede534844d5f2968f75bd2f520c4e8f558 not found: ID does not exist" Apr 16 18:34:28.484179 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484177 2577 scope.go:117] "RemoveContainer" containerID="0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7" Apr 16 18:34:28.484393 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484372 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7"} err="failed to get container status \"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7\": rpc error: code = NotFound desc = could not find container \"0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7\": container with ID starting with 0171a53a5fdf353dc823590c5192627ba78d2212f1e69dac46f44c63a36365d7 not found: ID does not exist" Apr 16 18:34:28.484440 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484394 2577 scope.go:117] "RemoveContainer" containerID="7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b" Apr 16 18:34:28.484440 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484386 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="prom-label-proxy" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484440 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="prom-label-proxy" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484458 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="330da926-d1e5-49b9-b00b-8c71db5b4276" containerName="console" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484464 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="330da926-d1e5-49b9-b00b-8c71db5b4276" containerName="console" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484478 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-metric" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484486 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-metric" Apr 16 18:34:28.484505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484504 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="init-config-reloader" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484512 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="init-config-reloader" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484536 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="config-reloader" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484544 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="config-reloader" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484553 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" 
containerName="kube-rbac-proxy-web" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484560 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-web" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484569 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484577 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484586 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="alertmanager" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484593 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="alertmanager" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484662 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-metric" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484660 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b"} err="failed to get container status \"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b\": rpc error: code = NotFound desc = could not find container \"7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b\": container with ID starting with 7cda41ddd795b58c0f730a94327578ca12c7cb28b515f6b494c0e099631d6f8b not found: ID does not exist" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484672 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="config-reloader" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484678 2577 scope.go:117] "RemoveContainer" containerID="15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484680 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy-web" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484727 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="alertmanager" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484734 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="prom-label-proxy" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484744 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" containerName="kube-rbac-proxy" Apr 16 18:34:28.484810 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.484750 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="330da926-d1e5-49b9-b00b-8c71db5b4276" containerName="console" Apr 16 18:34:28.485347 ip-10-0-140-1 kubenswrapper[2577]: I0416 
18:34:28.484958 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8"} err="failed to get container status \"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8\": rpc error: code = NotFound desc = could not find container \"15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8\": container with ID starting with 15b6ae2628f1abe2c6521b02ffa0724d9d24edd3261e171857fa21393f2ac1d8 not found: ID does not exist" Apr 16 18:34:28.487988 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.487972 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.491101 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491080 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:34:28.491191 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:34:28.491191 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:34:28.491401 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491386 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:34:28.491517 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:34:28.491884 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:34:28.491995 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:34:28.491995 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.491922 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:34:28.492098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.492023 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-66k7k\"" Apr 16 18:34:28.497270 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.497253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:34:28.504738 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.504717 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:28.618676 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.618644 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69197d1a-adb4-458a-9b05-1e0d33350333" path="/var/lib/kubelet/pods/69197d1a-adb4-458a-9b05-1e0d33350333/volumes" Apr 16 18:34:28.649176 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-web-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649299 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649299 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4lv\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-kube-api-access-kg4lv\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649299 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-config-out\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.649514 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.649499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750274 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-web-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kg4lv\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-kube-api-access-kg4lv\") pod \"alertmanager-main-0\" (UID: 
\"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750391 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750582 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750879 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-config-out\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.750879 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.750615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.752622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.751261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.752622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.751471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.752622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.752110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b68e9f68-7281-4657-935d-4795df995d7d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.753253 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.753225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.753345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.753257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b68e9f68-7281-4657-935d-4795df995d7d-config-out\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.753429 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.753398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.753662 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.753604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-web-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.753662 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.753620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.754451 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:34:28.754426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.755450 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.755423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.755530 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.755473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.755696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.755678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b68e9f68-7281-4657-935d-4795df995d7d-config-volume\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.759225 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.759199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg4lv\" (UniqueName: \"kubernetes.io/projected/b68e9f68-7281-4657-935d-4795df995d7d-kube-api-access-kg4lv\") pod \"alertmanager-main-0\" (UID: \"b68e9f68-7281-4657-935d-4795df995d7d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.797030 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.796984 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:28.929372 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:28.929348 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:28.931846 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:34:28.931815 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb68e9f68_7281_4657_935d_4795df995d7d.slice/crio-0c38edcb54821a53acd05ae7e2cfb46a03247dab7a1c57eb9bf5d3121078223b WatchSource:0}: Error finding container 0c38edcb54821a53acd05ae7e2cfb46a03247dab7a1c57eb9bf5d3121078223b: Status 404 returned error can't find the container with id 0c38edcb54821a53acd05ae7e2cfb46a03247dab7a1c57eb9bf5d3121078223b Apr 16 18:34:29.429136 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:29.429099 2577 generic.go:358] "Generic (PLEG): container finished" podID="b68e9f68-7281-4657-935d-4795df995d7d" containerID="641452307d74aef9e4b83b527572acf571f3c110ad89d1d9e6f554927347d4e4" exitCode=0 Apr 16 18:34:29.429548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:29.429178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerDied","Data":"641452307d74aef9e4b83b527572acf571f3c110ad89d1d9e6f554927347d4e4"} Apr 16 18:34:29.429548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:29.429214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"0c38edcb54821a53acd05ae7e2cfb46a03247dab7a1c57eb9bf5d3121078223b"} Apr 16 18:34:30.436045 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"d66f47805a6642b53cfc627ace1d91232195722cff274a632586bb6fff4cbcf7"} Apr 16 18:34:30.436045 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"13c709711ac21ce2fbe7da127511e834f338619f4459a11f0f7fe662630b2e50"} Apr 16 18:34:30.436045 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"6ada92bd8c37add3b7dc192cec57fdbf5e67bac0e49d0fd3b45d06c97050663a"} Apr 16 18:34:30.436602 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"e2314e67d21e974f0429413a580003894de94e370c4606f7316816855bfd65c9"} Apr 16 18:34:30.436602 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"24a3cb0407556602e3a25abe775f2d0fc28cb50e63d3529496b0b9d4be57c0f2"} Apr 16 18:34:30.436602 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.436077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b68e9f68-7281-4657-935d-4795df995d7d","Type":"ContainerStarted","Data":"daeb4abbceb64c9a9483cb7b4db477b53ea8e30a1fdd263c2f691b5a678a0226"} Apr 16 18:34:30.466749 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:30.466700 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.466687834 podStartE2EDuration="2.466687834s" podCreationTimestamp="2026-04-16 18:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:30.464935324 +0000 UTC m=+232.410366034" watchObservedRunningTime="2026-04-16 18:34:30.466687834 +0000 UTC m=+232.412118541" Apr 16 18:34:33.153161 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.153132 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:34:33.155855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.155830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.176217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.176186 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:34:33.288619 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4b79\" (UniqueName: \"kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288805 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:34:33.288761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.288977 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.288814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389332 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389332 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4b79\" (UniqueName: \"kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389529 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.389687 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.389536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert\") pod 
\"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.390270 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.390239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.390396 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.390297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.390396 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.390290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.390396 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.390331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.392038 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.392012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.392148 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.392082 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.400415 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.400387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4b79\" (UniqueName: \"kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79\") pod \"console-b486f9796-f9bkd\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.465418 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.465346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:33.810723 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:33.810698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:34:33.812309 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:34:33.812284 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e738a81_6ffa_4f25_aba7_a90f9e0b31e4.slice/crio-e7e0b92e0695dad4a1705f8b6cb9aef18a5e4e1fe9f8f7d8978e654b1b4adeb7 WatchSource:0}: Error finding container e7e0b92e0695dad4a1705f8b6cb9aef18a5e4e1fe9f8f7d8978e654b1b4adeb7: Status 404 returned error can't find the container with id e7e0b92e0695dad4a1705f8b6cb9aef18a5e4e1fe9f8f7d8978e654b1b4adeb7 Apr 16 18:34:34.450144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:34.450105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b486f9796-f9bkd" event={"ID":"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4","Type":"ContainerStarted","Data":"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64"} Apr 16 18:34:34.450144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:34.450148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b486f9796-f9bkd" event={"ID":"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4","Type":"ContainerStarted","Data":"e7e0b92e0695dad4a1705f8b6cb9aef18a5e4e1fe9f8f7d8978e654b1b4adeb7"} Apr 16 18:34:34.469417 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:34.469368 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b486f9796-f9bkd" podStartSLOduration=1.469351072 podStartE2EDuration="1.469351072s" podCreationTimestamp="2026-04-16 18:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:34.467867769 +0000 UTC m=+236.413298481" watchObservedRunningTime="2026-04-16 18:34:34.469351072 +0000 UTC m=+236.414781785" Apr 16 18:34:43.466197 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:43.466143 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:43.466587 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:43.466231 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:43.470930 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:43.470909 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:43.483033 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:43.483009 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:34:43.545913 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:34:43.545881 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:35:08.573091 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.572981 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-df8555c5b-spxfk" podUID="7e726afe-85e3-44c2-b794-d40597ecc578" containerName="console" containerID="cri-o://a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034" gracePeriod=15 Apr 16 18:35:08.810807 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.810787 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-console_console-df8555c5b-spxfk_7e726afe-85e3-44c2-b794-d40597ecc578/console/0.log" Apr 16 18:35:08.810917 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.810848 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:35:08.892731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892659 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.892731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892712 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.892731 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892729 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.893005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892760 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.893005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892808 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.893005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892834 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.893005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.892867 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m86bb\" (UniqueName: \"kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb\") pod \"7e726afe-85e3-44c2-b794-d40597ecc578\" (UID: \"7e726afe-85e3-44c2-b794-d40597ecc578\") " Apr 16 18:35:08.893204 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.893081 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca" (OuterVolumeSpecName: "service-ca") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:08.893252 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.893214 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:08.893337 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.893302 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config" (OuterVolumeSpecName: "console-config") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:08.893513 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.893488 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:08.894984 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.894957 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:35:08.895077 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.895039 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:35:08.895120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.895098 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb" (OuterVolumeSpecName: "kube-api-access-m86bb") pod "7e726afe-85e3-44c2-b794-d40597ecc578" (UID: "7e726afe-85e3-44c2-b794-d40597ecc578"). InnerVolumeSpecName "kube-api-access-m86bb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:35:08.993653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993619 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-console-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993647 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-service-ca\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993653 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993657 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-oauth-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993914 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993667 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993914 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993675 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7e726afe-85e3-44c2-b794-d40597ecc578-console-oauth-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993914 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993684 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e726afe-85e3-44c2-b794-d40597ecc578-trusted-ca-bundle\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:08.993914 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:08.993692 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m86bb\" (UniqueName: \"kubernetes.io/projected/7e726afe-85e3-44c2-b794-d40597ecc578-kube-api-access-m86bb\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:35:09.556585 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df8555c5b-spxfk_7e726afe-85e3-44c2-b794-d40597ecc578/console/0.log" Apr 16 18:35:09.556802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556596 2577 generic.go:358] "Generic (PLEG): container finished" podID="7e726afe-85e3-44c2-b794-d40597ecc578" containerID="a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034" exitCode=2 Apr 16 18:35:09.556802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556688 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df8555c5b-spxfk" Apr 16 18:35:09.556802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df8555c5b-spxfk" event={"ID":"7e726afe-85e3-44c2-b794-d40597ecc578","Type":"ContainerDied","Data":"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034"} Apr 16 18:35:09.556802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df8555c5b-spxfk" event={"ID":"7e726afe-85e3-44c2-b794-d40597ecc578","Type":"ContainerDied","Data":"c1d788d6c80ce76d0de10ae3fc4889cae70539cc8e64be2c40c466e4145418b7"} Apr 16 18:35:09.556802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.556755 2577 scope.go:117] "RemoveContainer" containerID="a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034" Apr 16 18:35:09.564712 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.564697 2577 scope.go:117] "RemoveContainer" containerID="a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034" Apr 16 18:35:09.564987 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:35:09.564970 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034\": container with ID starting with a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034 not found: ID does not exist" containerID="a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034" Apr 16 18:35:09.565037 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.564996 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034"} err="failed to get container status \"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034\": rpc error: code = NotFound desc = could not find container \"a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034\": container with ID starting with a285afdb650eb83ca3e3cf05bd49f2f9d3bd9a805c86af17b1107ef12410f034 not found: ID does not exist" Apr 16 18:35:09.577245 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.577223 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:35:09.581049 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:09.581030 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-df8555c5b-spxfk"] Apr 16 18:35:10.617862 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:10.617829 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e726afe-85e3-44c2-b794-d40597ecc578" path="/var/lib/kubelet/pods/7e726afe-85e3-44c2-b794-d40597ecc578/volumes" Apr 16 18:35:38.508993 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:38.508958 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:35:38.509455 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:38.509209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:35:38.511381 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:38.511359 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:35:38.511640 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:38.511620 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:35:38.517894 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:35:38.517865 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:38:40.475720 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.475686 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hzbsx"] Apr 16 18:38:40.476181 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.475996 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e726afe-85e3-44c2-b794-d40597ecc578" containerName="console" Apr 16 18:38:40.476181 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.476008 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e726afe-85e3-44c2-b794-d40597ecc578" containerName="console" Apr 16 18:38:40.476181 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.476083 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e726afe-85e3-44c2-b794-d40597ecc578" containerName="console" Apr 16 18:38:40.478759 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.478743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.481415 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.481392 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:38:40.494005 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.493984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hzbsx"] Apr 16 18:38:40.507281 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.507256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-dbus\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.507395 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.507329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-kubelet-config\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.507395 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.507385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed188e37-3c6c-4aa4-9451-3d99128b9dec-original-pull-secret\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.608547 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.608509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-dbus\") pod \"global-pull-secret-syncer-hzbsx\" (UID: 
\"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.608698 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.608557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-kubelet-config\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.608698 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.608596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed188e37-3c6c-4aa4-9451-3d99128b9dec-original-pull-secret\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.608806 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.608720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-dbus\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.608806 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.608721 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed188e37-3c6c-4aa4-9451-3d99128b9dec-kubelet-config\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.610950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.610921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed188e37-3c6c-4aa4-9451-3d99128b9dec-original-pull-secret\") pod \"global-pull-secret-syncer-hzbsx\" (UID: \"ed188e37-3c6c-4aa4-9451-3d99128b9dec\") " pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.787603 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.787528 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hzbsx" Apr 16 18:38:40.903996 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.903970 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hzbsx"] Apr 16 18:38:40.906787 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:38:40.906732 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded188e37_3c6c_4aa4_9451_3d99128b9dec.slice/crio-d0c9b7decf64636a49815d1e18c343ea2fbfb50a2bcf481790bd8e2a5e82305c WatchSource:0}: Error finding container d0c9b7decf64636a49815d1e18c343ea2fbfb50a2bcf481790bd8e2a5e82305c: Status 404 returned error can't find the container with id d0c9b7decf64636a49815d1e18c343ea2fbfb50a2bcf481790bd8e2a5e82305c Apr 16 18:38:40.908482 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:40.908462 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:38:41.178196 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:41.178157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hzbsx" event={"ID":"ed188e37-3c6c-4aa4-9451-3d99128b9dec","Type":"ContainerStarted","Data":"d0c9b7decf64636a49815d1e18c343ea2fbfb50a2bcf481790bd8e2a5e82305c"} Apr 16 18:38:45.191779 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:45.191739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hzbsx" event={"ID":"ed188e37-3c6c-4aa4-9451-3d99128b9dec","Type":"ContainerStarted","Data":"4fec38170492ee43b91f6a63d1f60bb9fa036e8fb719f09594c4b377640afeaa"} Apr 16 18:38:45.217951 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:38:45.217896 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hzbsx" podStartSLOduration=1.358528001 podStartE2EDuration="5.21788181s" podCreationTimestamp="2026-04-16 18:38:40 +0000 UTC" firstStartedPulling="2026-04-16 18:38:40.90858948 +0000 UTC m=+482.854020167" lastFinishedPulling="2026-04-16 18:38:44.767943284 +0000 UTC m=+486.713373976" observedRunningTime="2026-04-16 18:38:45.215708675 +0000 UTC m=+487.161139385" watchObservedRunningTime="2026-04-16 18:38:45.21788181 +0000 UTC m=+487.163312519" Apr 16 18:39:03.076861 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.076823 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9"] Apr 16 18:39:03.081796 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.081019 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.084622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.084593 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:39:03.084726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.084640 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-xp2xm\"" Apr 16 18:39:03.085294 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.085274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:39:03.098656 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.098632 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9"] Apr 16 18:39:03.195688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.195651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.195688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.195687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8629\" (UniqueName: \"kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.195913 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.195713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.296422 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.296389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.296422 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.296427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8629\" (UniqueName: \"kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.296623 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.296462 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.296973 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.296952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.296973 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.296967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.305572 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.305546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8629\" (UniqueName: \"kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.392014 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.391944 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:03.520412 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:03.520353 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9"] Apr 16 18:39:03.523578 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:39:03.523542 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod618bcbae_118c_4e47_992d_d9d422d6a8ca.slice/crio-419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451 WatchSource:0}: Error finding container 419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451: Status 404 returned error can't find the container with id 419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451 Apr 16 18:39:04.249707 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:04.249667 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerStarted","Data":"419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451"} Apr 16 18:39:09.268128 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:09.268093 2577 generic.go:358] "Generic (PLEG): container finished" podID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerID="a72dd02abe1e23ac16361ba22abc5a09741795152a504a4bf4e82b1e26a9b642" exitCode=0 Apr 16 18:39:09.268516 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:09.268180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerDied","Data":"a72dd02abe1e23ac16361ba22abc5a09741795152a504a4bf4e82b1e26a9b642"} Apr 16 18:39:11.277094 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:11.277061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerStarted","Data":"9ee645a7eeb36710b7ed801e6a405ec69cb84697e38a89f2ff8b4c30ce560cbc"} Apr 16 18:39:12.281318 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:12.281284 2577 generic.go:358] "Generic (PLEG): container finished" podID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerID="9ee645a7eeb36710b7ed801e6a405ec69cb84697e38a89f2ff8b4c30ce560cbc" exitCode=0 Apr 16 18:39:12.281756 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:12.281331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerDied","Data":"9ee645a7eeb36710b7ed801e6a405ec69cb84697e38a89f2ff8b4c30ce560cbc"} Apr 16 18:39:19.309752 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:19.309715 2577 generic.go:358] "Generic (PLEG): container finished" podID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerID="0d21e40dd7426585d189c20d5879651d10bc7a721e06ee267b6eff42757ea722" exitCode=0 Apr 16 18:39:19.309752 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:19.309754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" 
event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerDied","Data":"0d21e40dd7426585d189c20d5879651d10bc7a721e06ee267b6eff42757ea722"} Apr 16 18:39:20.434890 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.434864 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:20.552969 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.552932 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle\") pod \"618bcbae-118c-4e47-992d-d9d422d6a8ca\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " Apr 16 18:39:20.553147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.552986 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util\") pod \"618bcbae-118c-4e47-992d-d9d422d6a8ca\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " Apr 16 18:39:20.553147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.553013 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8629\" (UniqueName: \"kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629\") pod \"618bcbae-118c-4e47-992d-d9d422d6a8ca\" (UID: \"618bcbae-118c-4e47-992d-d9d422d6a8ca\") " Apr 16 18:39:20.553463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.553440 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle" (OuterVolumeSpecName: "bundle") pod "618bcbae-118c-4e47-992d-d9d422d6a8ca" (UID: "618bcbae-118c-4e47-992d-d9d422d6a8ca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:20.555307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.555273 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629" (OuterVolumeSpecName: "kube-api-access-w8629") pod "618bcbae-118c-4e47-992d-d9d422d6a8ca" (UID: "618bcbae-118c-4e47-992d-d9d422d6a8ca"). InnerVolumeSpecName "kube-api-access-w8629". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:39:20.557304 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.557276 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util" (OuterVolumeSpecName: "util") pod "618bcbae-118c-4e47-992d-d9d422d6a8ca" (UID: "618bcbae-118c-4e47-992d-d9d422d6a8ca"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:39:20.653991 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.653964 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-bundle\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:39:20.653991 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.653987 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/618bcbae-118c-4e47-992d-d9d422d6a8ca-util\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:39:20.654163 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:20.653996 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8629\" (UniqueName: \"kubernetes.io/projected/618bcbae-118c-4e47-992d-d9d422d6a8ca-kube-api-access-w8629\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:39:21.318328 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:21.318252 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" Apr 16 18:39:21.318328 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:21.318258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2t2t9" event={"ID":"618bcbae-118c-4e47-992d-d9d422d6a8ca","Type":"ContainerDied","Data":"419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451"} Apr 16 18:39:21.318328 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:21.318287 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419881a709c6ee263fd86f6f03d8ece7c41975bb6106c37151fa77cfe0b5f451" Apr 16 18:39:25.316046 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316014 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww"] Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316335 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="extract" Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316346 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="extract" Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316355 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="util" Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316361 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="util" Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316371 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="pull" Apr 16 18:39:25.316414 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316376 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="pull" Apr 16 18:39:25.316641 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.316443 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="618bcbae-118c-4e47-992d-d9d422d6a8ca" containerName="extract" Apr 16 18:39:25.319201 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:39:25.319184 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.322811 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.322748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:39:25.323027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.322998 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-c8h24\"" Apr 16 18:39:25.323714 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.323687 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:39:25.323833 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.323727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:39:25.339619 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.339595 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww"] Apr 16 18:39:25.496958 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.496910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxbm\" (UniqueName: \"kubernetes.io/projected/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-kube-api-access-scxbm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.497142 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.496979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.598316 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.598275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scxbm\" (UniqueName: \"kubernetes.io/projected/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-kube-api-access-scxbm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.598503 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.598340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.600569 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.600549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" 
Apr 16 18:39:25.607465 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.607442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxbm\" (UniqueName: \"kubernetes.io/projected/0c5a286c-580b-4cbb-81a8-cb117a7e6c36-kube-api-access-scxbm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-w2sww\" (UID: \"0c5a286c-580b-4cbb-81a8-cb117a7e6c36\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.634355 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.634329 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:25.760805 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:25.760762 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww"] Apr 16 18:39:25.763102 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:39:25.763062 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5a286c_580b_4cbb_81a8_cb117a7e6c36.slice/crio-5e803daa9bf39eda1302bbe96ad3f21dc739601eefe36bacb6b467c291868940 WatchSource:0}: Error finding container 5e803daa9bf39eda1302bbe96ad3f21dc739601eefe36bacb6b467c291868940: Status 404 returned error can't find the container with id 5e803daa9bf39eda1302bbe96ad3f21dc739601eefe36bacb6b467c291868940 Apr 16 18:39:26.336062 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:26.336028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" event={"ID":"0c5a286c-580b-4cbb-81a8-cb117a7e6c36","Type":"ContainerStarted","Data":"5e803daa9bf39eda1302bbe96ad3f21dc739601eefe36bacb6b467c291868940"} Apr 16 18:39:29.348090 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.348054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" event={"ID":"0c5a286c-580b-4cbb-81a8-cb117a7e6c36","Type":"ContainerStarted","Data":"066c31089b14766589861ff911e5e7c3eab36ed897b9ebcaf97edd3d8f008ffa"} Apr 16 18:39:29.348467 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.348121 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:29.384033 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.383984 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" podStartSLOduration=1.148166333 podStartE2EDuration="4.383968892s" podCreationTimestamp="2026-04-16 18:39:25 +0000 UTC" firstStartedPulling="2026-04-16 18:39:25.764897217 +0000 UTC m=+527.710327905" lastFinishedPulling="2026-04-16 18:39:29.000699755 +0000 UTC m=+530.946130464" observedRunningTime="2026-04-16 18:39:29.383221275 +0000 UTC m=+531.328651984" watchObservedRunningTime="2026-04-16 18:39:29.383968892 +0000 UTC m=+531.329399602" Apr 16 18:39:29.742044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.741957 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qmfld"] Apr 16 18:39:29.745216 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.745195 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.748399 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.748379 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:39:29.748501 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.748487 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:39:29.748706 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.748691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2zld6\"" Apr 16 18:39:29.756907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.756883 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qmfld"] Apr 16 18:39:29.836677 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.836645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgljb\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-kube-api-access-vgljb\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.836862 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.836703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cb723b18-f165-4ff0-9525-feed4427be33-cabundle0\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.836862 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.836810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.937592 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.937560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.937762 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.937638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgljb\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-kube-api-access-vgljb\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.937762 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:29.937665 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:39:29.937762 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:29.937686 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:39:29.937762 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:29.937699 2577 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-qmfld: references non-existent secret key: ca.crt Apr 16 18:39:29.937762 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.937700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cb723b18-f165-4ff0-9525-feed4427be33-cabundle0\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.938063 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:29.937853 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates podName:cb723b18-f165-4ff0-9525-feed4427be33 nodeName:}" failed. No retries permitted until 2026-04-16 18:39:30.437798836 +0000 UTC m=+532.383229528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates") pod "keda-operator-ffbb595cb-qmfld" (UID: "cb723b18-f165-4ff0-9525-feed4427be33") : references non-existent secret key: ca.crt Apr 16 18:39:29.938941 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.938915 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cb723b18-f165-4ff0-9525-feed4427be33-cabundle0\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:29.951180 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:29.951150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgljb\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-kube-api-access-vgljb\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:30.307044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.307003 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-fzccc"] Apr 16 18:39:30.310843 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.310822 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.313811 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.313786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:39:30.326423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.326397 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fzccc"] Apr 16 18:39:30.341138 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.341093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-certificates\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.341138 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.341138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmttn\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-kube-api-access-fmttn\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.441757 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.441719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.441821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-certificates\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.441846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmttn\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-kube-api-access-fmttn\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:30.441949 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:30.441974 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:30.441988 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qmfld: references non-existent secret key: ca.crt Apr 16 18:39:30.442266 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:39:30.442068 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates podName:cb723b18-f165-4ff0-9525-feed4427be33 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:39:31.442047001 +0000 UTC m=+533.387477712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates") pod "keda-operator-ffbb595cb-qmfld" (UID: "cb723b18-f165-4ff0-9525-feed4427be33") : references non-existent secret key: ca.crt Apr 16 18:39:30.444485 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.444459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-certificates\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.452425 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.452403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmttn\" (UniqueName: \"kubernetes.io/projected/7a0b4a25-8338-4b76-9f69-b24acbf46866-kube-api-access-fmttn\") pod \"keda-admission-cf49989db-fzccc\" (UID: \"7a0b4a25-8338-4b76-9f69-b24acbf46866\") " pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.622697 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.622670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:30.752526 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:30.752500 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fzccc"] Apr 16 18:39:30.754953 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:39:30.754924 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0b4a25_8338_4b76_9f69_b24acbf46866.slice/crio-722240cb496f2d6b4de7b91aba5c592f0361f8882e34c3006c323f00224059d2 WatchSource:0}: Error finding container 722240cb496f2d6b4de7b91aba5c592f0361f8882e34c3006c323f00224059d2: Status 404 returned error can't find the container with id 722240cb496f2d6b4de7b91aba5c592f0361f8882e34c3006c323f00224059d2 Apr 16 18:39:31.356903 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:31.356860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fzccc" event={"ID":"7a0b4a25-8338-4b76-9f69-b24acbf46866","Type":"ContainerStarted","Data":"722240cb496f2d6b4de7b91aba5c592f0361f8882e34c3006c323f00224059d2"} Apr 16 18:39:31.449708 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:31.449668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:31.452121 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:31.452088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cb723b18-f165-4ff0-9525-feed4427be33-certificates\") pod \"keda-operator-ffbb595cb-qmfld\" (UID: \"cb723b18-f165-4ff0-9525-feed4427be33\") " pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:31.555095 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:31.555057 2577 util.go:30] "No sandbox for pod can be found. 
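[Editor's note: the failed `certificates` mount for keda-operator-ffbb595cb-qmfld above is retried on a doubling delay: the first `nestedpendingoperations` error schedules the next attempt 500ms out, the second 1s out, and the 18:39:31 retry succeeds, presumably once the `kedaorg-certs` secret carries the `ca.crt` key it had been missing. The snippet below is a toy model of that doubling schedule only, not kubelet code; the ceiling value is an assumption, not shown in this log.]

```go
// Toy model of the retry spacing seen above: each failed MountVolume.SetUp
// doubles the wait before the next attempt (500ms, 1s, ...), up to an
// assumed ceiling. Values other than the first two delays are illustrative.
package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 500 * time.Millisecond
	maxBackoff := 2*time.Minute + 2*time.Second // assumption, not from this log
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d failed; durationBeforeRetry %v\n", attempt, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```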
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:31.694832 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:31.694801 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qmfld"] Apr 16 18:39:31.696839 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:39:31.696807 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb723b18_f165_4ff0_9525_feed4427be33.slice/crio-16df5b7e8a80d39e3515e79bc2adae5eaad14f5b018a0382ee79ddb828c44402 WatchSource:0}: Error finding container 16df5b7e8a80d39e3515e79bc2adae5eaad14f5b018a0382ee79ddb828c44402: Status 404 returned error can't find the container with id 16df5b7e8a80d39e3515e79bc2adae5eaad14f5b018a0382ee79ddb828c44402 Apr 16 18:39:32.362159 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:32.362123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fzccc" event={"ID":"7a0b4a25-8338-4b76-9f69-b24acbf46866","Type":"ContainerStarted","Data":"927fecef20fab38568c44ab124b7fcfa7637c5ac8fb04261477e7cfb6c0b3998"} Apr 16 18:39:32.362366 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:32.362195 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:32.363415 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:32.363388 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" event={"ID":"cb723b18-f165-4ff0-9525-feed4427be33","Type":"ContainerStarted","Data":"16df5b7e8a80d39e3515e79bc2adae5eaad14f5b018a0382ee79ddb828c44402"} Apr 16 18:39:32.382247 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:32.382199 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-fzccc" podStartSLOduration=0.919431921 podStartE2EDuration="2.38218566s" podCreationTimestamp="2026-04-16 18:39:30 +0000 UTC" firstStartedPulling="2026-04-16 18:39:30.756120853 +0000 UTC m=+532.701551541" lastFinishedPulling="2026-04-16 18:39:32.218874582 +0000 UTC m=+534.164305280" observedRunningTime="2026-04-16 18:39:32.379863077 +0000 UTC m=+534.325293789" watchObservedRunningTime="2026-04-16 18:39:32.38218566 +0000 UTC m=+534.327616370" Apr 16 18:39:35.374478 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:35.374445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" event={"ID":"cb723b18-f165-4ff0-9525-feed4427be33","Type":"ContainerStarted","Data":"f318a871808bd32223a27b2676d451fcf3f3f92c3b1405408cf7d15d9546d6fe"} Apr 16 18:39:35.374866 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:35.374564 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:39:35.396370 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:35.396324 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" podStartSLOduration=3.584954804 podStartE2EDuration="6.396312268s" podCreationTimestamp="2026-04-16 18:39:29 +0000 UTC" firstStartedPulling="2026-04-16 18:39:31.698423646 +0000 UTC m=+533.643854334" lastFinishedPulling="2026-04-16 18:39:34.509781105 +0000 UTC m=+536.455211798" observedRunningTime="2026-04-16 18:39:35.394967014 +0000 UTC m=+537.340397723" watchObservedRunningTime="2026-04-16 18:39:35.396312268 +0000 UTC m=+537.341742977" Apr 16 
18:39:50.354703 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:50.354671 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-w2sww" Apr 16 18:39:53.369610 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:53.369566 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-fzccc" Apr 16 18:39:56.380098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:39:56.380070 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-qmfld" Apr 16 18:40:38.201576 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.201539 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-74rb9"] Apr 16 18:40:38.204984 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.204962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.207476 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.207449 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:40:38.208279 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.208260 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zldsj\"" Apr 16 18:40:38.208372 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.208306 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:40:38.213361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.213342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:40:38.221549 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.221522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-74rb9"] Apr 16 18:40:38.338866 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.338827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcd4s\" (UniqueName: \"kubernetes.io/projected/67781650-8927-4cc0-9570-d23f75a9eed9-kube-api-access-bcd4s\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.338866 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.338873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67781650-8927-4cc0-9570-d23f75a9eed9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.439423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.439391 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcd4s\" (UniqueName: \"kubernetes.io/projected/67781650-8927-4cc0-9570-d23f75a9eed9-kube-api-access-bcd4s\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.439423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.439436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/67781650-8927-4cc0-9570-d23f75a9eed9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.442133 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.442110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67781650-8927-4cc0-9570-d23f75a9eed9-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.448365 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.448339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcd4s\" (UniqueName: \"kubernetes.io/projected/67781650-8927-4cc0-9570-d23f75a9eed9-kube-api-access-bcd4s\") pod \"llmisvc-controller-manager-68cc5db7c4-74rb9\" (UID: \"67781650-8927-4cc0-9570-d23f75a9eed9\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.517685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.517598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zldsj\"" Apr 16 18:40:38.525438 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.525412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:38.536007 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.535980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:40:38.536502 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.536481 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:40:38.538425 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.538400 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:40:38.539068 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.539048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:40:38.669878 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:38.669853 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-74rb9"] Apr 16 18:40:38.672864 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:40:38.672831 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod67781650_8927_4cc0_9570_d23f75a9eed9.slice/crio-3c7c8c2aca6f3555cdda2a189d7ee8af073c50ea5cd172a114a7055f828bd330 WatchSource:0}: Error finding container 3c7c8c2aca6f3555cdda2a189d7ee8af073c50ea5cd172a114a7055f828bd330: Status 404 returned error can't find the container with id 3c7c8c2aca6f3555cdda2a189d7ee8af073c50ea5cd172a114a7055f828bd330 Apr 16 18:40:39.594055 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:39.594014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" 
event={"ID":"67781650-8927-4cc0-9570-d23f75a9eed9","Type":"ContainerStarted","Data":"3c7c8c2aca6f3555cdda2a189d7ee8af073c50ea5cd172a114a7055f828bd330"} Apr 16 18:40:41.604044 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:41.604003 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" event={"ID":"67781650-8927-4cc0-9570-d23f75a9eed9","Type":"ContainerStarted","Data":"05471d77d19a718a3be7348843b11e04e112618f93dd647efea775e43950d020"} Apr 16 18:40:41.604429 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:41.604088 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:40:41.623548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:40:41.623485 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" podStartSLOduration=1.774986362 podStartE2EDuration="3.623467431s" podCreationTimestamp="2026-04-16 18:40:38 +0000 UTC" firstStartedPulling="2026-04-16 18:40:38.674325685 +0000 UTC m=+600.619756377" lastFinishedPulling="2026-04-16 18:40:40.522806758 +0000 UTC m=+602.468237446" observedRunningTime="2026-04-16 18:40:41.621472681 +0000 UTC m=+603.566903404" watchObservedRunningTime="2026-04-16 18:40:41.623467431 +0000 UTC m=+603.568898145" Apr 16 18:41:12.610282 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:12.610252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-74rb9" Apr 16 18:41:47.302816 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.302780 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-twwgv"] Apr 16 18:41:47.304898 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.304880 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.307596 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.307566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:41:47.307725 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.307698 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-nb6cn\"" Apr 16 18:41:47.324214 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.324189 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-twwgv"] Apr 16 18:41:47.430466 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.430427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2k4d\" (UniqueName: \"kubernetes.io/projected/16312cc7-ba8e-496d-99ce-c12255f40602-kube-api-access-n2k4d\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.430628 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.430475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16312cc7-ba8e-496d-99ce-c12255f40602-tls-certs\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.531406 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.531373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2k4d\" (UniqueName: \"kubernetes.io/projected/16312cc7-ba8e-496d-99ce-c12255f40602-kube-api-access-n2k4d\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.531578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.531418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16312cc7-ba8e-496d-99ce-c12255f40602-tls-certs\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.533918 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.533882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/16312cc7-ba8e-496d-99ce-c12255f40602-tls-certs\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.543415 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.543385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2k4d\" (UniqueName: \"kubernetes.io/projected/16312cc7-ba8e-496d-99ce-c12255f40602-kube-api-access-n2k4d\") pod \"model-serving-api-86f7b4b499-twwgv\" (UID: \"16312cc7-ba8e-496d-99ce-c12255f40602\") " pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.615247 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.615215 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:47.965249 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:47.965163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-twwgv"] Apr 16 18:41:47.968424 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:41:47.968388 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16312cc7_ba8e_496d_99ce_c12255f40602.slice/crio-06da3ead3238e7f50bd04cb76a09ae32f4760c4373fb212c1e6b6689140ceff7 WatchSource:0}: Error finding container 06da3ead3238e7f50bd04cb76a09ae32f4760c4373fb212c1e6b6689140ceff7: Status 404 returned error can't find the container with id 06da3ead3238e7f50bd04cb76a09ae32f4760c4373fb212c1e6b6689140ceff7 Apr 16 18:41:48.823257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:48.823223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-twwgv" event={"ID":"16312cc7-ba8e-496d-99ce-c12255f40602","Type":"ContainerStarted","Data":"06da3ead3238e7f50bd04cb76a09ae32f4760c4373fb212c1e6b6689140ceff7"} Apr 16 18:41:50.833590 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:50.833561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-twwgv" event={"ID":"16312cc7-ba8e-496d-99ce-c12255f40602","Type":"ContainerStarted","Data":"3c725687c7f79e3421cd4d43992744016df5ac970d598bb7c7429a74b8ab712b"} Apr 16 18:41:50.834072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:50.833677 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:41:50.851589 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:41:50.851545 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-twwgv" podStartSLOduration=1.660714789 podStartE2EDuration="3.851532806s" podCreationTimestamp="2026-04-16 18:41:47 +0000 UTC" firstStartedPulling="2026-04-16 18:41:47.970245873 +0000 UTC m=+669.915676577" lastFinishedPulling="2026-04-16 18:41:50.161063887 +0000 UTC m=+672.106494594" observedRunningTime="2026-04-16 18:41:50.849479038 +0000 UTC m=+672.794909762" watchObservedRunningTime="2026-04-16 18:41:50.851532806 +0000 UTC m=+672.796963516" Apr 16 18:42:01.843176 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:01.843142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-twwgv" Apr 16 18:42:23.161164 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.161080 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:42:23.169166 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.169139 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:23.171931 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.171905 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rqcg5\"" Apr 16 18:42:23.173933 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.173901 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:42:23.226891 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.226850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96\" (UID: \"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:23.328174 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.328143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96\" (UID: \"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:23.328488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.328470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96\" (UID: \"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:23.479627 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.479533 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:23.607544 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.605808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:42:23.943244 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:23.943209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerStarted","Data":"a344177248619f1080cd4a108fb63a0f941123fe322eccafad466d3878ff0e2a"} Apr 16 18:42:26.956069 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:26.956039 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerStarted","Data":"fb49dd7257294991a5bad8a3b1856b577042075c8f581d54f97019812f20e81b"} Apr 16 18:42:30.970645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:30.970607 2577 generic.go:358] "Generic (PLEG): container finished" podID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerID="fb49dd7257294991a5bad8a3b1856b577042075c8f581d54f97019812f20e81b" exitCode=0 Apr 16 18:42:30.971040 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:30.970682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerDied","Data":"fb49dd7257294991a5bad8a3b1856b577042075c8f581d54f97019812f20e81b"} Apr 16 18:42:44.032212 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:44.032173 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerStarted","Data":"67636c009dbd28646e7ab049afc62ce4593b5a094e4836cec024693dfe88c181"} Apr 16 18:42:47.044282 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:47.044242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerStarted","Data":"c5ca3f920dcfdbe4e3b4ad4d9d05c124acc45b8040e1bcf48daebeaa4782e1f6"} Apr 16 18:42:47.044667 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:47.044505 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:42:47.046033 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:47.046007 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:42:47.068552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:47.068505 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podStartSLOduration=0.751977289 podStartE2EDuration="24.068493455s" podCreationTimestamp="2026-04-16 18:42:23 +0000 UTC" firstStartedPulling="2026-04-16 18:42:23.60704998 +0000 UTC m=+705.552480668" lastFinishedPulling="2026-04-16 18:42:46.923566133 +0000 
Apr 16 18:42:48.047684 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:48.047643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"
Apr 16 18:42:48.048100 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:48.047724 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:42:48.048671 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:48.048646 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:42:49.051451 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:49.051414 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:42:49.051932 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:49.051715 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:42:59.052232 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:59.052189 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:42:59.052720 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:42:59.052611 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:43:09.051487 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:09.051441 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:43:09.051940 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:09.051914 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:43:19.052057 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:19.052009 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:43:19.052542 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:19.052513 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:29.051868 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:29.051817 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:43:29.052307 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:29.052285 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:39.051727 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:39.051681 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:43:39.052273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:39.052235 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:49.051949 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:49.051919 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:43:49.052426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:49.052384 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:43:58.318833 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.318722 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:43:58.319304 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.319038 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" containerID="cri-o://67636c009dbd28646e7ab049afc62ce4593b5a094e4836cec024693dfe88c181" gracePeriod=30 Apr 16 18:43:58.319304 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.319136 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" 
containerID="cri-o://c5ca3f920dcfdbe4e3b4ad4d9d05c124acc45b8040e1bcf48daebeaa4782e1f6" gracePeriod=30 Apr 16 18:43:58.391392 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.391360 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:43:58.394881 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.394860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:43:58.410334 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.410308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:43:58.444703 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.444676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2\" (UID: \"869ab569-4bfe-46d7-947b-59b8f47c7ae1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:43:58.545089 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.545050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2\" (UID: \"869ab569-4bfe-46d7-947b-59b8f47c7ae1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:43:58.545411 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.545389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2\" (UID: \"869ab569-4bfe-46d7-947b-59b8f47c7ae1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:43:58.704568 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:58.704474 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:43:59.034509 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.034437 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:43:59.037713 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:43:59.037682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869ab569_4bfe_46d7_947b_59b8f47c7ae1.slice/crio-99ce385a97ccd78e758a65c24f56adc920bce31c5587dd2eb5a9671c48d131fc WatchSource:0}: Error finding container 99ce385a97ccd78e758a65c24f56adc920bce31c5587dd2eb5a9671c48d131fc: Status 404 returned error can't find the container with id 99ce385a97ccd78e758a65c24f56adc920bce31c5587dd2eb5a9671c48d131fc Apr 16 18:43:59.039584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.039569 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:43:59.052185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.052161 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:43:59.052962 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.052939 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:43:59.298150 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.298067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerStarted","Data":"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a"} Apr 16 18:43:59.298150 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:43:59.298103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerStarted","Data":"99ce385a97ccd78e758a65c24f56adc920bce31c5587dd2eb5a9671c48d131fc"} Apr 16 18:44:03.320418 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:03.320379 2577 generic.go:358] "Generic (PLEG): container finished" podID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerID="67636c009dbd28646e7ab049afc62ce4593b5a094e4836cec024693dfe88c181" exitCode=0 Apr 16 18:44:03.320857 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:03.320455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerDied","Data":"67636c009dbd28646e7ab049afc62ce4593b5a094e4836cec024693dfe88c181"} Apr 16 18:44:03.321934 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:03.321912 2577 generic.go:358] "Generic (PLEG): container finished" podID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerID="75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a" exitCode=0 Apr 16 18:44:03.322046 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:03.321974 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerDied","Data":"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a"} Apr 16 18:44:04.326322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:04.326290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerStarted","Data":"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d"} Apr 16 18:44:04.326728 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:04.326583 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:44:04.327715 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:04.327688 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:04.345041 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:04.344994 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podStartSLOduration=6.344979834 podStartE2EDuration="6.344979834s" podCreationTimestamp="2026-04-16 18:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:04.343455742 +0000 UTC m=+806.288886452" watchObservedRunningTime="2026-04-16 18:44:04.344979834 +0000 UTC m=+806.290410547" Apr 16 18:44:05.330830 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:05.330791 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:09.051981 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:09.051938 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:44:09.052837 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:09.052809 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:15.331622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:15.331580 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:19.052393 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:19.052347 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 18:44:19.052840 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:19.052559 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:44:19.053112 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:19.053088 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:44:19.053198 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:19.053187 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:44:25.331600 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:25.331559 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:28.414668 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.414632 2577 generic.go:358] "Generic (PLEG): container finished" podID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerID="c5ca3f920dcfdbe4e3b4ad4d9d05c124acc45b8040e1bcf48daebeaa4782e1f6" exitCode=0 Apr 16 18:44:28.415027 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.414701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerDied","Data":"c5ca3f920dcfdbe4e3b4ad4d9d05c124acc45b8040e1bcf48daebeaa4782e1f6"} Apr 16 18:44:28.456811 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.456782 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:44:28.612815 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.612714 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location\") pod \"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07\" (UID: \"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07\") " Apr 16 18:44:28.613073 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.613052 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" (UID: "be5e84d4-0c1e-4d19-b7af-82f82a9f5f07"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:28.713398 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:28.713359 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:44:29.420227 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.420194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" event={"ID":"be5e84d4-0c1e-4d19-b7af-82f82a9f5f07","Type":"ContainerDied","Data":"a344177248619f1080cd4a108fb63a0f941123fe322eccafad466d3878ff0e2a"} Apr 16 18:44:29.420651 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.420242 2577 scope.go:117] "RemoveContainer" containerID="c5ca3f920dcfdbe4e3b4ad4d9d05c124acc45b8040e1bcf48daebeaa4782e1f6" Apr 16 18:44:29.420651 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.420203 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96" Apr 16 18:44:29.429132 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.429097 2577 scope.go:117] "RemoveContainer" containerID="67636c009dbd28646e7ab049afc62ce4593b5a094e4836cec024693dfe88c181" Apr 16 18:44:29.438507 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.438479 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:44:29.438835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.438815 2577 scope.go:117] "RemoveContainer" containerID="fb49dd7257294991a5bad8a3b1856b577042075c8f581d54f97019812f20e81b" Apr 16 18:44:29.445725 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:29.445699 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-8e196-predictor-f7dbdb484-d7w96"] Apr 16 18:44:30.622907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:30.622866 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" path="/var/lib/kubelet/pods/be5e84d4-0c1e-4d19-b7af-82f82a9f5f07/volumes" Apr 16 18:44:35.330946 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:35.330902 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:45.331835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:45.331787 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:44:55.331734 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:44:55.331686 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:45:05.331691 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:05.331645 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 18:45:15.331981 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:15.331944 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:45:38.564129 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.564100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:45:38.566463 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.566432 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:45:38.566587 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.566436 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:45:38.568519 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.568500 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:45:38.660801 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.660758 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:45:38.661060 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.661023 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" containerID="cri-o://f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d" gracePeriod=30 Apr 16 18:45:38.709855 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.709821 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"] Apr 16 18:45:38.710223 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710209 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" Apr 16 18:45:38.710277 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710224 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container" Apr 16 18:45:38.710277 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710234 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" Apr 16 18:45:38.710277 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710240 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent" Apr 16 18:45:38.710277 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710258 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="storage-initializer" Apr 16 18:45:38.710277 ip-10-0-140-1 kubenswrapper[2577]: 
Apr 16 18:45:38.710434 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710319 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="agent"
Apr 16 18:45:38.710434 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.710328 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="be5e84d4-0c1e-4d19-b7af-82f82a9f5f07" containerName="kserve-container"
Apr 16 18:45:38.713323 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.713305 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"
Apr 16 18:45:38.722368 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.721437 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"]
Apr 16 18:45:38.806622 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.806584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv\" (UID: \"aec133a5-e6b1-4385-b8a0-386b10bbb6ab\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"
Apr 16 18:45:38.907840 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.907806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv\" (UID: \"aec133a5-e6b1-4385-b8a0-386b10bbb6ab\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"
Apr 16 18:45:38.908173 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:38.908153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv\" (UID: \"aec133a5-e6b1-4385-b8a0-386b10bbb6ab\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"
Apr 16 18:45:39.029147 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:39.029112 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" Apr 16 18:45:39.150693 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:39.150667 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"] Apr 16 18:45:39.153081 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:45:39.153053 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec133a5_e6b1_4385_b8a0_386b10bbb6ab.slice/crio-6983fc755b3632f7f0b825e33712f29f2ef5e455573f0d9d27f76657ede06b71 WatchSource:0}: Error finding container 6983fc755b3632f7f0b825e33712f29f2ef5e455573f0d9d27f76657ede06b71: Status 404 returned error can't find the container with id 6983fc755b3632f7f0b825e33712f29f2ef5e455573f0d9d27f76657ede06b71 Apr 16 18:45:39.656606 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:39.656571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerStarted","Data":"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29"} Apr 16 18:45:39.656606 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:39.656607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerStarted","Data":"6983fc755b3632f7f0b825e33712f29f2ef5e455573f0d9d27f76657ede06b71"} Apr 16 18:45:42.905048 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:42.905023 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:45:42.941449 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:42.941416 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location\") pod \"869ab569-4bfe-46d7-947b-59b8f47c7ae1\" (UID: \"869ab569-4bfe-46d7-947b-59b8f47c7ae1\") " Apr 16 18:45:42.941762 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:42.941739 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "869ab569-4bfe-46d7-947b-59b8f47c7ae1" (UID: "869ab569-4bfe-46d7-947b-59b8f47c7ae1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:43.043012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.042929 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869ab569-4bfe-46d7-947b-59b8f47c7ae1-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:45:43.671891 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.671858 2577 generic.go:358] "Generic (PLEG): container finished" podID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerID="ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29" exitCode=0 Apr 16 18:45:43.672092 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.671938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerDied","Data":"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29"} Apr 16 18:45:43.673338 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.673315 2577 generic.go:358] "Generic (PLEG): container finished" podID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerID="f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d" exitCode=0 Apr 16 18:45:43.673447 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.673365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerDied","Data":"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d"} Apr 16 18:45:43.673447 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.673373 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" Apr 16 18:45:43.673447 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.673390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2" event={"ID":"869ab569-4bfe-46d7-947b-59b8f47c7ae1","Type":"ContainerDied","Data":"99ce385a97ccd78e758a65c24f56adc920bce31c5587dd2eb5a9671c48d131fc"} Apr 16 18:45:43.673447 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.673405 2577 scope.go:117] "RemoveContainer" containerID="f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d" Apr 16 18:45:43.681987 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.681971 2577 scope.go:117] "RemoveContainer" containerID="75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a" Apr 16 18:45:43.689693 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.689575 2577 scope.go:117] "RemoveContainer" containerID="f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d" Apr 16 18:45:43.690217 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:45:43.690191 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d\": container with ID starting with f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d not found: ID does not exist" containerID="f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d" Apr 16 18:45:43.690296 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.690226 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d"} err="failed to get container status \"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d\": rpc error: code = NotFound desc = could not find container \"f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d\": container with ID starting with f0d6463fdb2fc26abf8b8f45fe3ef46b110e026842c5e6e80d7d57c9834dc22d not found: ID does not exist" Apr 16 18:45:43.690296 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.690254 2577 scope.go:117] "RemoveContainer" containerID="75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a" Apr 16 18:45:43.690562 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:45:43.690540 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a\": container with ID starting with 75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a not found: ID does not exist" containerID="75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a" Apr 16 18:45:43.690629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.690571 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a"} err="failed to get container status \"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a\": rpc error: code = NotFound desc = could not find container \"75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a\": container with ID starting with 75354495b34fc3d1abd8f1610cf4b9f3850a8073cb4d9a98ff93aa786f89aa1a not found: ID does not exist" Apr 16 18:45:43.703544 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.703454 2577 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:45:43.708457 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:43.708427 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1a6d3-predictor-78bb7647f7-f9gs2"] Apr 16 18:45:44.621799 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:44.621743 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" path="/var/lib/kubelet/pods/869ab569-4bfe-46d7-947b-59b8f47c7ae1/volumes" Apr 16 18:45:44.679003 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:44.678964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerStarted","Data":"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1"} Apr 16 18:45:44.679272 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:44.679241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" Apr 16 18:45:44.680709 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:44.680681 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:45:44.697257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:44.697214 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podStartSLOduration=6.697200931 podStartE2EDuration="6.697200931s" podCreationTimestamp="2026-04-16 18:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:45:44.695206311 +0000 UTC m=+906.640637022" watchObservedRunningTime="2026-04-16 18:45:44.697200931 +0000 UTC m=+906.642631714" Apr 16 18:45:45.682808 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:45.682756 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:45:55.683424 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:45:55.683381 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:05.683074 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:05.683031 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:15.683222 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:15.683176 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:25.683025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:25.682978 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:35.682857 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:35.682811 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:45.683465 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:45.683422 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:46.614658 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:46.614622 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:46:56.618422 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:46:56.618354 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" Apr 16 18:47:19.077744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:19.077711 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"] Apr 16 18:47:19.078251 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:19.077985 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" containerID="cri-o://c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1" gracePeriod=30 Apr 16 18:47:23.317325 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:23.317302 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" Apr 16 18:47:23.496871 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:23.496792 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location\") pod \"aec133a5-e6b1-4385-b8a0-386b10bbb6ab\" (UID: \"aec133a5-e6b1-4385-b8a0-386b10bbb6ab\") " Apr 16 18:47:23.497089 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:23.497070 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aec133a5-e6b1-4385-b8a0-386b10bbb6ab" (UID: "aec133a5-e6b1-4385-b8a0-386b10bbb6ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:23.597584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:23.597557 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aec133a5-e6b1-4385-b8a0-386b10bbb6ab-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:47:24.014726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.014687 2577 generic.go:358] "Generic (PLEG): container finished" podID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerID="c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1" exitCode=0 Apr 16 18:47:24.014911 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.014781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerDied","Data":"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1"} Apr 16 18:47:24.014911 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.014824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" event={"ID":"aec133a5-e6b1-4385-b8a0-386b10bbb6ab","Type":"ContainerDied","Data":"6983fc755b3632f7f0b825e33712f29f2ef5e455573f0d9d27f76657ede06b71"} Apr 16 18:47:24.014911 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.014840 2577 scope.go:117] "RemoveContainer" containerID="c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1" Apr 16 18:47:24.014911 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.014792 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv" Apr 16 18:47:24.023394 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.023377 2577 scope.go:117] "RemoveContainer" containerID="ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29" Apr 16 18:47:24.030548 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.030530 2577 scope.go:117] "RemoveContainer" containerID="c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1" Apr 16 18:47:24.030800 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:47:24.030761 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1\": container with ID starting with c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1 not found: ID does not exist" containerID="c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1" Apr 16 18:47:24.030852 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.030809 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1"} err="failed to get container status \"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1\": rpc error: code = NotFound desc = could not find container \"c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1\": container with ID starting with c9e3d62831a3b96c84c5db924a9865e7e781b1145301ada7992185581ac502c1 not found: ID does not exist" Apr 16 18:47:24.030852 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.030828 2577 scope.go:117] "RemoveContainer" containerID="ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29" Apr 16 18:47:24.031029 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:47:24.031015 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29\": container with ID starting with ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29 not found: ID does not exist" containerID="ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29" Apr 16 18:47:24.031072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.031034 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29"} err="failed to get container status \"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29\": rpc error: code = NotFound desc = could not find container \"ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29\": container with ID starting with ce638b6a458a99b79362a1f1f6e5170fcbd28900374b10028404fe333f174a29 not found: ID does not exist" Apr 16 18:47:24.038750 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.038728 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"] Apr 16 18:47:24.043010 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.042989 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-fd0ec-predictor-5b56948696-vj9fv"] Apr 16 18:47:24.617997 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:24.617966 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" 
path="/var/lib/kubelet/pods/aec133a5-e6b1-4385-b8a0-386b10bbb6ab/volumes" Apr 16 18:47:29.124471 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124428 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:47:29.124826 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124816 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124830 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124840 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124845 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124853 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="storage-initializer" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124859 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="storage-initializer" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124867 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="storage-initializer" Apr 16 18:47:29.124886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124873 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="storage-initializer" Apr 16 18:47:29.125085 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124933 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="869ab569-4bfe-46d7-947b-59b8f47c7ae1" containerName="kserve-container" Apr 16 18:47:29.125085 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.124942 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aec133a5-e6b1-4385-b8a0-386b10bbb6ab" containerName="kserve-container" Apr 16 18:47:29.127886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.127869 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:29.130117 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.130098 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rqcg5\"" Apr 16 18:47:29.139603 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.139577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location\") pod \"isvc-logger-raw-79732-predictor-f97f4f564-lwgbp\" (UID: \"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170\") " pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:29.142708 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.142687 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:47:29.240926 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.240890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location\") pod \"isvc-logger-raw-79732-predictor-f97f4f564-lwgbp\" (UID: \"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170\") " pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:29.241236 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.241219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location\") pod \"isvc-logger-raw-79732-predictor-f97f4f564-lwgbp\" (UID: \"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170\") " pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:29.438445 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.438371 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:29.564859 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:29.564831 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:47:29.567886 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:47:29.567857 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf03d9f60_6ac7_4bf8_9e7f_6d48ed4d1170.slice/crio-cbfbcc474b168b08d912d810d791f3d6131f3ec2c9c77d33e15525ae2939e59b WatchSource:0}: Error finding container cbfbcc474b168b08d912d810d791f3d6131f3ec2c9c77d33e15525ae2939e59b: Status 404 returned error can't find the container with id cbfbcc474b168b08d912d810d791f3d6131f3ec2c9c77d33e15525ae2939e59b Apr 16 18:47:30.035242 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:30.035205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerStarted","Data":"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe"} Apr 16 18:47:30.035242 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:30.035245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerStarted","Data":"cbfbcc474b168b08d912d810d791f3d6131f3ec2c9c77d33e15525ae2939e59b"} Apr 16 18:47:34.050593 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:34.050559 2577 generic.go:358] "Generic (PLEG): container finished" podID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerID="c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe" exitCode=0 Apr 16 18:47:34.050985 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:34.050635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerDied","Data":"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe"} Apr 16 18:47:35.056146 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.056114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerStarted","Data":"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666"} Apr 16 18:47:35.056146 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.056149 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerStarted","Data":"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8"} Apr 16 18:47:35.056571 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.056528 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:35.056571 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.056561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:47:35.057814 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.057780 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:47:35.058455 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.058434 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:47:35.086268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:35.086226 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podStartSLOduration=6.086213414 podStartE2EDuration="6.086213414s" podCreationTimestamp="2026-04-16 18:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:35.085163964 +0000 UTC m=+1017.030594675" watchObservedRunningTime="2026-04-16 18:47:35.086213414 +0000 UTC m=+1017.031644124" Apr 16 18:47:36.060962 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:36.060926 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:47:36.061350 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:36.061222 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:47:46.061099 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:46.061047 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:47:46.061552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:46.061527 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:47:56.061990 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:56.061939 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:47:56.062562 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:47:56.062466 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:06.061118 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:06.061069 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:48:06.061505 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:06.061469 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:16.061865 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:16.061812 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:48:16.062299 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:16.062274 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:26.061352 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:26.061247 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:48:26.061723 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:26.061699 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:36.061866 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:36.061809 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:48:36.062410 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:36.062283 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:46.061976 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:46.061947 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:48:46.062446 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:46.062012 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:48:54.395729 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.395682 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:48:54.396166 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.396113 2577 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" containerID="cri-o://e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8" gracePeriod=30 Apr 16 18:48:54.396235 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.396180 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" containerID="cri-o://f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666" gracePeriod=30 Apr 16 18:48:54.430037 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.430006 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"] Apr 16 18:48:54.433423 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.433404 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:48:54.456312 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.456287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"] Apr 16 18:48:54.567636 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.567603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k\" (UID: \"6c2e06ef-387f-4446-b5cf-1edb2fae7e01\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:48:54.669185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.669083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k\" (UID: \"6c2e06ef-387f-4446-b5cf-1edb2fae7e01\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:48:54.669487 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.669463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k\" (UID: \"6c2e06ef-387f-4446-b5cf-1edb2fae7e01\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:48:54.742979 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.742939 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:48:54.877533 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:54.877493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"] Apr 16 18:48:54.880452 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:48:54.880422 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2e06ef_387f_4446_b5cf_1edb2fae7e01.slice/crio-fff444cbbc6376a6d6cb7b4bd9f50edc178e11923f541cdbb4be16c4845dda22 WatchSource:0}: Error finding container fff444cbbc6376a6d6cb7b4bd9f50edc178e11923f541cdbb4be16c4845dda22: Status 404 returned error can't find the container with id fff444cbbc6376a6d6cb7b4bd9f50edc178e11923f541cdbb4be16c4845dda22 Apr 16 18:48:55.333475 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:55.333441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerStarted","Data":"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"} Apr 16 18:48:55.333475 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:55.333478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerStarted","Data":"fff444cbbc6376a6d6cb7b4bd9f50edc178e11923f541cdbb4be16c4845dda22"} Apr 16 18:48:56.061888 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:56.061830 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:48:56.062315 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:56.062211 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:48:59.349300 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:59.349211 2577 generic.go:358] "Generic (PLEG): container finished" podID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerID="e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8" exitCode=0 Apr 16 18:48:59.349300 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:59.349280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerDied","Data":"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8"} Apr 16 18:48:59.350572 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:59.350547 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerID="f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729" exitCode=0 Apr 16 18:48:59.350670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:59.350624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" 
event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerDied","Data":"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"} Apr 16 18:48:59.351651 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:48:59.351637 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:49:00.355372 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:00.355335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerStarted","Data":"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"} Apr 16 18:49:00.355864 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:00.355682 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:49:00.356811 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:00.356785 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:00.372709 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:00.372660 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podStartSLOduration=6.37264377 podStartE2EDuration="6.37264377s" podCreationTimestamp="2026-04-16 18:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:00.371176613 +0000 UTC m=+1102.316607323" watchObservedRunningTime="2026-04-16 18:49:00.37264377 +0000 UTC m=+1102.318074482" Apr 16 18:49:01.358761 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:01.358724 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:06.061786 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:06.061722 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:49:06.062215 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:06.062050 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:49:11.359055 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:11.359005 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:16.061680 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:16.061628 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:49:16.062136 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:16.061808 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:49:16.062136 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:16.061980 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:49:16.062136 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:16.062064 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:49:20.888527 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.888492 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c9844876b-8xmmx"] Apr 16 18:49:20.892273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.892248 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.905639 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.905611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9844876b-8xmmx"] Apr 16 18:49:20.990120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-service-ca\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-console-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-trusted-ca-bundle\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-oauth-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndx5x\" 
(UniqueName: \"kubernetes.io/projected/b34c8247-0103-4976-8268-fedfabee9ba7-kube-api-access-ndx5x\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990322 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:20.990538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:20.990357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-oauth-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091519 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-trusted-ca-bundle\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091519 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-oauth-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndx5x\" (UniqueName: \"kubernetes.io/projected/b34c8247-0103-4976-8268-fedfabee9ba7-kube-api-access-ndx5x\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-oauth-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-service-ca\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " 
pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.091744 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.091703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-console-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.092386 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.092360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-console-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.092483 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.092384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-oauth-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.092483 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.092410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-trusted-ca-bundle\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.092611 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.092587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b34c8247-0103-4976-8268-fedfabee9ba7-service-ca\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.094273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.094252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-oauth-config\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.094273 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.094270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b34c8247-0103-4976-8268-fedfabee9ba7-console-serving-cert\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.099737 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.099712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndx5x\" (UniqueName: \"kubernetes.io/projected/b34c8247-0103-4976-8268-fedfabee9ba7-kube-api-access-ndx5x\") pod \"console-c9844876b-8xmmx\" (UID: \"b34c8247-0103-4976-8268-fedfabee9ba7\") " pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.203121 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.203038 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:21.359426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.359381 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:21.539078 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:21.539054 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9844876b-8xmmx"] Apr 16 18:49:21.540756 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:49:21.540720 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb34c8247_0103_4976_8268_fedfabee9ba7.slice/crio-77f416869a41028f14b1a607d5956e33768aaea4ae82e500f4c24a817882bcb4 WatchSource:0}: Error finding container 77f416869a41028f14b1a607d5956e33768aaea4ae82e500f4c24a817882bcb4: Status 404 returned error can't find the container with id 77f416869a41028f14b1a607d5956e33768aaea4ae82e500f4c24a817882bcb4 Apr 16 18:49:22.433667 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:22.433630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9844876b-8xmmx" event={"ID":"b34c8247-0103-4976-8268-fedfabee9ba7","Type":"ContainerStarted","Data":"f8f1ab7779f50ea56184ad684caebe65bca2fa8f9145d8dab945d0b63cbabca8"} Apr 16 18:49:22.433667 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:22.433669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9844876b-8xmmx" event={"ID":"b34c8247-0103-4976-8268-fedfabee9ba7","Type":"ContainerStarted","Data":"77f416869a41028f14b1a607d5956e33768aaea4ae82e500f4c24a817882bcb4"} Apr 16 18:49:22.453452 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:22.453408 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c9844876b-8xmmx" podStartSLOduration=2.453396506 podStartE2EDuration="2.453396506s" podCreationTimestamp="2026-04-16 18:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:22.452268603 +0000 UTC m=+1124.397699313" watchObservedRunningTime="2026-04-16 18:49:22.453396506 +0000 UTC m=+1124.398827215" Apr 16 18:49:24.580877 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:24.580852 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:49:24.623804 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:24.623757 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location\") pod \"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170\" (UID: \"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170\") " Apr 16 18:49:24.624081 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:24.624054 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" (UID: "f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:49:24.725072 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:24.724989 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:25.445574 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.445539 2577 generic.go:358] "Generic (PLEG): container finished" podID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerID="f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666" exitCode=137 Apr 16 18:49:25.445784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.445590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerDied","Data":"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666"} Apr 16 18:49:25.445784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.445617 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" Apr 16 18:49:25.445784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.445625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp" event={"ID":"f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170","Type":"ContainerDied","Data":"cbfbcc474b168b08d912d810d791f3d6131f3ec2c9c77d33e15525ae2939e59b"} Apr 16 18:49:25.445784 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.445644 2577 scope.go:117] "RemoveContainer" containerID="f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666" Apr 16 18:49:25.453813 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.453788 2577 scope.go:117] "RemoveContainer" containerID="e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8" Apr 16 18:49:25.460989 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.460971 2577 scope.go:117] "RemoveContainer" containerID="c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe" Apr 16 18:49:25.468443 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.468427 2577 scope.go:117] "RemoveContainer" containerID="f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666" Apr 16 18:49:25.468696 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:49:25.468669 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666\": container with ID starting with f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666 not found: ID does not exist" containerID="f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666" Apr 16 18:49:25.468752 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.468708 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666"} err="failed to get container status \"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666\": rpc error: code = NotFound desc = could not find container \"f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666\": container with ID starting with f9ba968e5e08cc76b386e67ef8364675e7bf1244d6df972a2b4661bbfbfa9666 not found: ID does not exist" Apr 16 18:49:25.468752 ip-10-0-140-1 kubenswrapper[2577]: I0416 
18:49:25.468732 2577 scope.go:117] "RemoveContainer" containerID="e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8" Apr 16 18:49:25.469000 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:49:25.468979 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8\": container with ID starting with e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8 not found: ID does not exist" containerID="e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8" Apr 16 18:49:25.469079 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.469009 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8"} err="failed to get container status \"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8\": rpc error: code = NotFound desc = could not find container \"e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8\": container with ID starting with e65fad0e774fb7ea83ecfe7765e27f84a33e12643f7759b5a73de825e91ef4f8 not found: ID does not exist" Apr 16 18:49:25.469079 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.469026 2577 scope.go:117] "RemoveContainer" containerID="c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe" Apr 16 18:49:25.469079 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.468986 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:49:25.469219 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:49:25.469205 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe\": container with ID starting with c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe not found: ID does not exist" containerID="c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe" Apr 16 18:49:25.469255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.469223 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe"} err="failed to get container status \"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe\": rpc error: code = NotFound desc = could not find container \"c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe\": container with ID starting with c7b527cf53f63f9ab31a9ce1a29266669926a98aefb2e7a875a2111a0ef31ffe not found: ID does not exist" Apr 16 18:49:25.476279 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:25.476260 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-79732-predictor-f97f4f564-lwgbp"] Apr 16 18:49:26.618121 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:26.618088 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" path="/var/lib/kubelet/pods/f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170/volumes" Apr 16 18:49:31.204019 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.203983 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:31.204461 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.204031 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:31.208533 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.208509 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:31.359199 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.359159 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:31.472907 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.472829 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c9844876b-8xmmx" Apr 16 18:49:31.538221 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:31.538188 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:49:41.358944 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:41.358902 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:51.359036 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:51.358947 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:49:56.559828 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.559752 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b486f9796-f9bkd" podUID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" containerName="console" containerID="cri-o://9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64" gracePeriod=15 Apr 16 18:49:56.804831 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.804810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b486f9796-f9bkd_1e738a81-6ffa-4f25-aba7-a90f9e0b31e4/console/0.log" Apr 16 18:49:56.804944 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.804868 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:49:56.923626 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923594 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.923835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923638 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4b79\" (UniqueName: \"kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.923835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923657 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.923835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923738 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.923835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.923835 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923835 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.924109 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.923889 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config\") pod \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\" (UID: \"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4\") " Apr 16 18:49:56.924230 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.924199 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca" (OuterVolumeSpecName: "service-ca") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:49:56.924580 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.924523 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:49:56.924580 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.924528 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:49:56.924730 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.924588 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config" (OuterVolumeSpecName: "console-config") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:49:56.926089 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.926063 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:49:56.926190 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.926154 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:49:56.926386 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:56.926365 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79" (OuterVolumeSpecName: "kube-api-access-n4b79") pod "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" (UID: "1e738a81-6ffa-4f25-aba7-a90f9e0b31e4"). InnerVolumeSpecName "kube-api-access-n4b79". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:49:57.025086 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025054 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-service-ca\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025086 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025080 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-oauth-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025086 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025090 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-trusted-ca-bundle\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025100 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025108 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-oauth-config\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025116 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4b79\" (UniqueName: \"kubernetes.io/projected/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-kube-api-access-n4b79\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.025292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.025125 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4-console-serving-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:49:57.558169 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558142 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b486f9796-f9bkd_1e738a81-6ffa-4f25-aba7-a90f9e0b31e4/console/0.log" Apr 16 18:49:57.558345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558182 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" containerID="9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64" exitCode=2 Apr 16 18:49:57.558345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b486f9796-f9bkd" event={"ID":"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4","Type":"ContainerDied","Data":"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64"} Apr 16 18:49:57.558345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b486f9796-f9bkd" event={"ID":"1e738a81-6ffa-4f25-aba7-a90f9e0b31e4","Type":"ContainerDied","Data":"e7e0b92e0695dad4a1705f8b6cb9aef18a5e4e1fe9f8f7d8978e654b1b4adeb7"} Apr 16 18:49:57.558345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558282 2577 scope.go:117] "RemoveContainer" 
containerID="9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64" Apr 16 18:49:57.558345 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.558286 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b486f9796-f9bkd" Apr 16 18:49:57.567427 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.567289 2577 scope.go:117] "RemoveContainer" containerID="9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64" Apr 16 18:49:57.567657 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:49:57.567539 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64\": container with ID starting with 9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64 not found: ID does not exist" containerID="9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64" Apr 16 18:49:57.567657 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.567562 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64"} err="failed to get container status \"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64\": rpc error: code = NotFound desc = could not find container \"9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64\": container with ID starting with 9ea8c4a012a3c6e2a680722ba48a3e727db9bbe7956edc3ec744bd434fc61d64 not found: ID does not exist" Apr 16 18:49:57.579592 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.579567 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:49:57.583650 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:57.583626 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b486f9796-f9bkd"] Apr 16 18:49:58.618107 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:49:58.618078 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" path="/var/lib/kubelet/pods/1e738a81-6ffa-4f25-aba7-a90f9e0b31e4/volumes" Apr 16 18:50:01.359564 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:01.359519 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:03.613690 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:03.613648 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:13.614282 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:13.614239 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:23.614636 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:23.614591 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:33.614628 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:33.614583 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:38.589371 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:38.589339 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:50:38.591407 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:38.591382 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:50:38.591538 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:38.591434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:50:38.593561 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:38.593542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:50:43.614688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:43.614643 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:50:53.614481 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:50:53.614433 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:51:03.614317 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:03.614274 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:51:07.614010 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:07.613968 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:51:17.614942 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:17.614912 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" Apr 16 18:51:24.581338 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.581261 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"] Apr 16 18:51:24.581726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.581624 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" containerID="cri-o://24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106" gracePeriod=30 Apr 16 18:51:24.658069 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658032 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"] Apr 16 18:51:24.658381 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658369 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="storage-initializer" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658383 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="storage-initializer" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658393 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658399 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658414 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658419 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" Apr 16 18:51:24.658426 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658427 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" containerName="console" Apr 16 18:51:24.658601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658433 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" containerName="console" Apr 16 18:51:24.658601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658485 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="agent" Apr 16 18:51:24.658601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658494 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f03d9f60-6ac7-4bf8-9e7f-6d48ed4d1170" containerName="kserve-container" Apr 16 18:51:24.658601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.658501 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e738a81-6ffa-4f25-aba7-a90f9e0b31e4" containerName="console" Apr 16 18:51:24.661537 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.661520 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:24.672071 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.672044 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"] Apr 16 18:51:24.773846 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.773801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location\") pod \"isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8\" (UID: \"da0daa06-f6df-4f51-95c7-98d163d34200\") " pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:24.874728 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.874649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location\") pod \"isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8\" (UID: \"da0daa06-f6df-4f51-95c7-98d163d34200\") " pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:24.875034 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.875017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location\") pod \"isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8\" (UID: \"da0daa06-f6df-4f51-95c7-98d163d34200\") " pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:24.971950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:24.971921 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:25.298924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:25.298892 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"] Apr 16 18:51:25.300992 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:51:25.300966 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0daa06_f6df_4f51_95c7_98d163d34200.slice/crio-c0c0c6b406a31ed84d04573a53d0b8e7bdc18b62d62a14fe5091f59da8ece15a WatchSource:0}: Error finding container c0c0c6b406a31ed84d04573a53d0b8e7bdc18b62d62a14fe5091f59da8ece15a: Status 404 returned error can't find the container with id c0c0c6b406a31ed84d04573a53d0b8e7bdc18b62d62a14fe5091f59da8ece15a Apr 16 18:51:25.857808 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:25.857745 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerStarted","Data":"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d"} Apr 16 18:51:25.858169 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:25.857817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerStarted","Data":"c0c0c6b406a31ed84d04573a53d0b8e7bdc18b62d62a14fe5091f59da8ece15a"} Apr 16 18:51:27.613982 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:27.613945 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:51:29.871846 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:29.871811 2577 generic.go:358] "Generic (PLEG): container finished" podID="da0daa06-f6df-4f51-95c7-98d163d34200" containerID="3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d" exitCode=0 Apr 16 18:51:29.872195 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:29.871884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerDied","Data":"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d"} Apr 16 18:51:30.876704 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:30.876668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerStarted","Data":"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15"} Apr 16 18:51:30.877183 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:30.877087 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:51:30.878473 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:30.878448 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" 
Apr 16 18:51:30.891705 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:30.891659 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podStartSLOduration=6.891645381 podStartE2EDuration="6.891645381s" podCreationTimestamp="2026-04-16 18:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:30.891611825 +0000 UTC m=+1252.837042529" watchObservedRunningTime="2026-04-16 18:51:30.891645381 +0000 UTC m=+1252.837076092"
Apr 16 18:51:31.879693 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:31.879659 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:51:33.626631 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.626600 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"
Apr 16 18:51:33.747466 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.747390 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location\") pod \"6c2e06ef-387f-4446-b5cf-1edb2fae7e01\" (UID: \"6c2e06ef-387f-4446-b5cf-1edb2fae7e01\") "
Apr 16 18:51:33.747688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.747665 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6c2e06ef-387f-4446-b5cf-1edb2fae7e01" (UID: "6c2e06ef-387f-4446-b5cf-1edb2fae7e01"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:51:33.848134 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.848086 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c2e06ef-387f-4446-b5cf-1edb2fae7e01-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\""
Apr 16 18:51:33.886845 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.886814 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerID="24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106" exitCode=0
Apr 16 18:51:33.887012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.886888 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"
Apr 16 18:51:33.887012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.886901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerDied","Data":"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"}
Apr 16 18:51:33.887012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.886953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k" event={"ID":"6c2e06ef-387f-4446-b5cf-1edb2fae7e01","Type":"ContainerDied","Data":"fff444cbbc6376a6d6cb7b4bd9f50edc178e11923f541cdbb4be16c4845dda22"}
Apr 16 18:51:33.887012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.886974 2577 scope.go:117] "RemoveContainer" containerID="24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"
Apr 16 18:51:33.895608 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.895594 2577 scope.go:117] "RemoveContainer" containerID="f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"
Apr 16 18:51:33.902583 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.902566 2577 scope.go:117] "RemoveContainer" containerID="24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"
Apr 16 18:51:33.902852 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:51:33.902835 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106\": container with ID starting with 24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106 not found: ID does not exist" containerID="24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"
Apr 16 18:51:33.902912 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.902861 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106"} err="failed to get container status \"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106\": rpc error: code = NotFound desc = could not find container \"24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106\": container with ID starting with 24921c71973ac3e036a96f73ac59816ba82d9c6d933bf7f60afcf2a259e0a106 not found: ID does not exist"
Apr 16 18:51:33.902912 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.902877 2577 scope.go:117] "RemoveContainer" containerID="f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"
Apr 16 18:51:33.903077 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:51:33.903061 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729\": container with ID starting with f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729 not found: ID does not exist" containerID="f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"
Apr 16 18:51:33.903123 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.903080 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729"} err="failed to get container status \"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729\": rpc error: code = NotFound desc = could not find container \"f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729\": container with ID starting with f2c39a6f339f30ffe95f3978757a01ea678d333b4e81e6bba4b124d7c6df7729 not found: ID does not exist"
Apr 16 18:51:33.906932 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.906901 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"]
Apr 16 18:51:33.910643 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:33.910620 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-858f6-predictor-58964bb9c7-hxp4k"]
Apr 16 18:51:34.618688 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:34.618653 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" path="/var/lib/kubelet/pods/6c2e06ef-387f-4446-b5cf-1edb2fae7e01/volumes"
Apr 16 18:51:41.880435 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:41.880388 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:51:51.879965 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:51:51.879926 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:52:01.880185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:01.880134 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:52:11.879980 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:11.879937 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:52:21.880471 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:21.880435 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:52:31.880163 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:31.880119 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:52:41.880983 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:41.880948 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"
Apr 16 18:52:44.799157 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799119 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:52:44.799586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799471 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="storage-initializer"
Apr 16 18:52:44.799586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799482 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="storage-initializer"
Apr 16 18:52:44.799586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799490 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container"
Apr 16 18:52:44.799586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799496 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container"
Apr 16 18:52:44.799586 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.799563 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c2e06ef-387f-4446-b5cf-1edb2fae7e01" containerName="kserve-container"
Apr 16 18:52:44.802610 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.802594 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:44.805139 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.805117 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-2e895d\""
Apr 16 18:52:44.805264 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.805186 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 18:52:44.806797 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.806016 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-2e895d-dockercfg-vbp9x\""
Apr 16 18:52:44.814397 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.814367 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:52:44.934601 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.934560 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:44.934797 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:44.934688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.035670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.035634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.035670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.035677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.036018 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.035998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.036256 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.036239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert\") pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") " pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.120260 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.120227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:52:45.243729 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:45.243607 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:52:45.246734 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:52:45.246707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107a8f2e_ba63_42a2_94b3_c48d52b96cb3.slice/crio-f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4 WatchSource:0}: Error finding container f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4: Status 404 returned error can't find the container with id f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4
Apr 16 18:52:46.131161 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:46.131119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerStarted","Data":"470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a"}
Apr 16 18:52:46.131161 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:46.131153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerStarted","Data":"f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4"}
Apr 16 18:52:49.143922 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:49.143898 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/0.log"
Apr 16 18:52:49.144284 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:49.143935 2577 generic.go:358] "Generic (PLEG): container finished" podID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerID="470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a" exitCode=1
Apr 16 18:52:49.144284 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:49.143961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerDied","Data":"470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a"}
Apr 16 18:52:50.149262 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:50.149232 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/0.log"
Apr 16 18:52:50.149642 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:50.149309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerStarted","Data":"a9dff622391e8916c6ae8ffc33254c4f76cd202a343ef526c712105636fb0fb8"}
Apr 16 18:52:53.159583 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.159554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/1.log"
Apr 16 18:52:53.160056 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.159933 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/0.log"
Apr 16 18:52:53.160056 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.159966 2577 generic.go:358] "Generic (PLEG): container finished" podID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerID="a9dff622391e8916c6ae8ffc33254c4f76cd202a343ef526c712105636fb0fb8" exitCode=1
Apr 16 18:52:53.160056 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.160014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerDied","Data":"a9dff622391e8916c6ae8ffc33254c4f76cd202a343ef526c712105636fb0fb8"}
Apr 16 18:52:53.160056 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.160046 2577 scope.go:117] "RemoveContainer" containerID="470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a"
Apr 16 18:52:53.160413 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:53.160395 2577 scope.go:117] "RemoveContainer" containerID="470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a"
Apr 16 18:52:53.170863 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:52:53.170831 2577 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_kserve-ci-e2e-test_107a8f2e-ba63-42a2-94b3-c48d52b96cb3_0 in pod sandbox f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4 from index: no such id: '470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a'" containerID="470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a"
Apr 16 18:52:53.170944 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:52:53.170891 2577 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_kserve-ci-e2e-test_107a8f2e-ba63-42a2-94b3-c48d52b96cb3_0 in pod sandbox f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4 from index: no such id: '470258ca8b5e9a8c44bee62241afaf2f2640d643aab5e759e39c1a97a08ac47a'; Skipping pod \"isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_kserve-ci-e2e-test(107a8f2e-ba63-42a2-94b3-c48d52b96cb3)\"" logger="UnhandledError"
Apr 16 18:52:53.172240 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:52:53.172221 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_kserve-ci-e2e-test(107a8f2e-ba63-42a2-94b3-c48d52b96cb3)\"" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3"
Apr 16 18:52:54.164466 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:52:54.164440 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/1.log"
Apr 16 18:53:00.877870 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.877838 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:53:00.931735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.931667 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"]
Apr 16 18:53:00.932096 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.932046 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" containerID="cri-o://ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15" gracePeriod=30
Apr 16 18:53:00.975427 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.975397 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"]
Apr 16 18:53:00.978824 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.978802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:00.981144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.981122 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-7ec1f4\""
Apr 16 18:53:00.981266 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.981193 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-7ec1f4-dockercfg-kt6fh\""
Apr 16 18:53:00.987274 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:00.987245 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"]
Apr 16 18:53:01.023004 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.022985 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/1.log"
Apr 16 18:53:01.023098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.023045 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:53:01.073292 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.073257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.073436 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.073344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.173834 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.173726 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert\") pod \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") "
Apr 16 18:53:01.173961 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.173824 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location\") pod \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\" (UID: \"107a8f2e-ba63-42a2-94b3-c48d52b96cb3\") "
Apr 16 18:53:01.174004 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.173968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.174082 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174058 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "107a8f2e-ba63-42a2-94b3-c48d52b96cb3" (UID: "107a8f2e-ba63-42a2-94b3-c48d52b96cb3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:01.174115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174067 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "107a8f2e-ba63-42a2-94b3-c48d52b96cb3" (UID: "107a8f2e-ba63-42a2-94b3-c48d52b96cb3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:53:01.174115 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.174184 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174172 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-cabundle-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\""
Apr 16 18:53:01.174220 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174190 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107a8f2e-ba63-42a2-94b3-c48d52b96cb3-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\""
Apr 16 18:53:01.174430 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.174676 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.174660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert\") pod \"isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.189812 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.189795 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-2e895d-predictor-6b4f9f87-2cczm_107a8f2e-ba63-42a2-94b3-c48d52b96cb3/storage-initializer/1.log"
Apr 16 18:53:01.189940 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.189877 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm" event={"ID":"107a8f2e-ba63-42a2-94b3-c48d52b96cb3","Type":"ContainerDied","Data":"f40bd958b99e41fdd11a967d0234f64cba248452a38a7a04e7104fea9a605ae4"}
Apr 16 18:53:01.189940 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.189908 2577 scope.go:117] "RemoveContainer" containerID="a9dff622391e8916c6ae8ffc33254c4f76cd202a343ef526c712105636fb0fb8"
Apr 16 18:53:01.190007 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.189908 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"
Apr 16 18:53:01.222164 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.222140 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:53:01.226574 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.226548 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-2e895d-predictor-6b4f9f87-2cczm"]
Apr 16 18:53:01.290459 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.290430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"
Apr 16 18:53:01.407551 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.407455 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"]
Apr 16 18:53:01.409892 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:53:01.409863 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75b79a9_b080_43fd_96ca_1452dc1b5bd1.slice/crio-629287f25c54811f1d6544e9af371b9eed76a2277bc172ef8d7a573f69ff5d8d WatchSource:0}: Error finding container 629287f25c54811f1d6544e9af371b9eed76a2277bc172ef8d7a573f69ff5d8d: Status 404 returned error can't find the container with id 629287f25c54811f1d6544e9af371b9eed76a2277bc172ef8d7a573f69ff5d8d
Apr 16 18:53:01.880294 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:01.880249 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:53:02.194885 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:02.194779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerStarted","Data":"6567ae2fa425c326c66155138ad0b0bb357a6d9e139c93aa80b442b17c71ee64"}
Apr 16 18:53:02.194885 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:02.194822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerStarted","Data":"629287f25c54811f1d6544e9af371b9eed76a2277bc172ef8d7a573f69ff5d8d"}
Apr 16 18:53:02.617859 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:02.617815 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" path="/var/lib/kubelet/pods/107a8f2e-ba63-42a2-94b3-c48d52b96cb3/volumes"
Apr 16 18:53:04.203699 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:04.203670 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/0.log"
Apr 16 18:53:04.204095 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:04.203711 2577 generic.go:358] "Generic (PLEG): container finished" podID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerID="6567ae2fa425c326c66155138ad0b0bb357a6d9e139c93aa80b442b17c71ee64" exitCode=1
Apr 16 18:53:04.204095 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:04.203794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerDied","Data":"6567ae2fa425c326c66155138ad0b0bb357a6d9e139c93aa80b442b17c71ee64"}
Apr 16 18:53:05.209372 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.209344 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/0.log"
Apr 16 18:53:05.209740 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.209443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerStarted","Data":"779bda5c4d5c73b721bbe192d42cf327df500adc7f9a6bf1d0ac61ce8d8ff8b3"}
Apr 16 18:53:05.377567 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.377547 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"
Apr 16 18:53:05.514563 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.514488 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location\") pod \"da0daa06-f6df-4f51-95c7-98d163d34200\" (UID: \"da0daa06-f6df-4f51-95c7-98d163d34200\") "
Apr 16 18:53:05.514826 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.514803 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da0daa06-f6df-4f51-95c7-98d163d34200" (UID: "da0daa06-f6df-4f51-95c7-98d163d34200"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:53:05.616082 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.616050 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da0daa06-f6df-4f51-95c7-98d163d34200-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\""
Apr 16 18:53:05.989375 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:05.989335 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"]
Apr 16 18:53:06.125578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125546 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"]
Apr 16 18:53:06.125929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125915 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="storage-initializer"
Apr 16 18:53:06.125929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125930 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="storage-initializer"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125942 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125948 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125959 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125965 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.125998 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.126025 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.126004 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.126217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.126059 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.126217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.126068 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" containerName="kserve-container"
Apr 16 18:53:06.126217 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.126165 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="107a8f2e-ba63-42a2-94b3-c48d52b96cb3" containerName="storage-initializer"
Apr 16 18:53:06.129204 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.129187 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:06.139499 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.139471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"] Apr 16 18:53:06.215802 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.215743 2577 generic.go:358] "Generic (PLEG): container finished" podID="da0daa06-f6df-4f51-95c7-98d163d34200" containerID="ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15" exitCode=0 Apr 16 18:53:06.216193 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.215803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerDied","Data":"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15"} Apr 16 18:53:06.216193 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.215825 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" Apr 16 18:53:06.216193 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.215839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8" event={"ID":"da0daa06-f6df-4f51-95c7-98d163d34200","Type":"ContainerDied","Data":"c0c0c6b406a31ed84d04573a53d0b8e7bdc18b62d62a14fe5091f59da8ece15a"} Apr 16 18:53:06.216193 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.215855 2577 scope.go:117] "RemoveContainer" containerID="ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15" Apr 16 18:53:06.216401 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.216203 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" containerID="cri-o://779bda5c4d5c73b721bbe192d42cf327df500adc7f9a6bf1d0ac61ce8d8ff8b3" gracePeriod=30 Apr 16 18:53:06.220700 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.220677 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location\") pod \"raw-sklearn-dd577-predictor-7758bff698-bxx9x\" (UID: \"7dfe33f4-89ad-46f9-9a1b-77218902c940\") " pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:06.224174 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.224152 2577 scope.go:117] "RemoveContainer" containerID="3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d" Apr 16 18:53:06.231412 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.231390 2577 scope.go:117] "RemoveContainer" containerID="ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15" Apr 16 18:53:06.231636 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:53:06.231617 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15\": container with ID starting with ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15 not found: ID does not exist" containerID="ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15" Apr 16 18:53:06.231694 ip-10-0-140-1 kubenswrapper[2577]: I0416 
18:53:06.231641 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15"} err="failed to get container status \"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15\": rpc error: code = NotFound desc = could not find container \"ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15\": container with ID starting with ca763b51078d6e06810fb231fc22fe4b32bd8200e0a3e369889942cb9359ca15 not found: ID does not exist" Apr 16 18:53:06.231694 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.231657 2577 scope.go:117] "RemoveContainer" containerID="3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d" Apr 16 18:53:06.231927 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:53:06.231903 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d\": container with ID starting with 3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d not found: ID does not exist" containerID="3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d" Apr 16 18:53:06.232014 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.231933 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d"} err="failed to get container status \"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d\": rpc error: code = NotFound desc = could not find container \"3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d\": container with ID starting with 3f4ebcd5985c3e3236406c715756ea3e45843fc95fcf19fa657470528740071d not found: ID does not exist" Apr 16 18:53:06.236570 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.236550 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"] Apr 16 18:53:06.241794 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.241723 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-2e895d-predictor-6b8ff99d46-ghzg8"] Apr 16 18:53:06.322155 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.322125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location\") pod \"raw-sklearn-dd577-predictor-7758bff698-bxx9x\" (UID: \"7dfe33f4-89ad-46f9-9a1b-77218902c940\") " pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:06.322497 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.322469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location\") pod \"raw-sklearn-dd577-predictor-7758bff698-bxx9x\" (UID: \"7dfe33f4-89ad-46f9-9a1b-77218902c940\") " pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:06.440836 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.440799 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:06.564585 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.564479 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"] Apr 16 18:53:06.566702 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:53:06.566672 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dfe33f4_89ad_46f9_9a1b_77218902c940.slice/crio-f95b3a93e95475017f18e291c732586499452fc50b5f6dd13c3b2fbc072b52e3 WatchSource:0}: Error finding container f95b3a93e95475017f18e291c732586499452fc50b5f6dd13c3b2fbc072b52e3: Status 404 returned error can't find the container with id f95b3a93e95475017f18e291c732586499452fc50b5f6dd13c3b2fbc072b52e3 Apr 16 18:53:06.618460 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:06.618435 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0daa06-f6df-4f51-95c7-98d163d34200" path="/var/lib/kubelet/pods/da0daa06-f6df-4f51-95c7-98d163d34200/volumes" Apr 16 18:53:07.221268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:07.221234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerStarted","Data":"78751763408c3ce537c8065445eacf4e552e7c7b6f2823ed631b6fb6ad5e3997"} Apr 16 18:53:07.221268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:07.221270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerStarted","Data":"f95b3a93e95475017f18e291c732586499452fc50b5f6dd13c3b2fbc072b52e3"} Apr 16 18:53:11.235682 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.235653 2577 generic.go:358] "Generic (PLEG): container finished" podID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerID="78751763408c3ce537c8065445eacf4e552e7c7b6f2823ed631b6fb6ad5e3997" exitCode=0 Apr 16 18:53:11.236032 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.235721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerDied","Data":"78751763408c3ce537c8065445eacf4e552e7c7b6f2823ed631b6fb6ad5e3997"} Apr 16 18:53:11.237396 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.237378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/1.log" Apr 16 18:53:11.237696 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.237683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/0.log" Apr 16 18:53:11.237809 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.237713 2577 generic.go:358] "Generic (PLEG): container finished" podID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerID="779bda5c4d5c73b721bbe192d42cf327df500adc7f9a6bf1d0ac61ce8d8ff8b3" exitCode=1 Apr 16 18:53:11.237809 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.237755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" 
event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerDied","Data":"779bda5c4d5c73b721bbe192d42cf327df500adc7f9a6bf1d0ac61ce8d8ff8b3"} Apr 16 18:53:11.237942 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.237880 2577 scope.go:117] "RemoveContainer" containerID="6567ae2fa425c326c66155138ad0b0bb357a6d9e139c93aa80b442b17c71ee64" Apr 16 18:53:11.371879 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.371856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/1.log" Apr 16 18:53:11.372001 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.371924 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" Apr 16 18:53:11.466861 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.466755 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert\") pod \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " Apr 16 18:53:11.466861 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.466822 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location\") pod \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\" (UID: \"b75b79a9-b080-43fd-96ca-1452dc1b5bd1\") " Apr 16 18:53:11.467100 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.467072 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b75b79a9-b080-43fd-96ca-1452dc1b5bd1" (UID: "b75b79a9-b080-43fd-96ca-1452dc1b5bd1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:53:11.467139 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.467125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "b75b79a9-b080-43fd-96ca-1452dc1b5bd1" (UID: "b75b79a9-b080-43fd-96ca-1452dc1b5bd1"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:53:11.568119 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.568079 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-cabundle-cert\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:53:11.568119 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:11.568115 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b75b79a9-b080-43fd-96ca-1452dc1b5bd1-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:53:12.242144 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.242114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerStarted","Data":"554f77f25e31ef46167950e7ac0d245ff3f87ffb759537095a52507cb9232caa"} Apr 16 18:53:12.242560 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.242408 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:53:12.243488 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.243469 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw_b75b79a9-b080-43fd-96ca-1452dc1b5bd1/storage-initializer/1.log" Apr 16 18:53:12.243612 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.243515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" event={"ID":"b75b79a9-b080-43fd-96ca-1452dc1b5bd1","Type":"ContainerDied","Data":"629287f25c54811f1d6544e9af371b9eed76a2277bc172ef8d7a573f69ff5d8d"} Apr 16 18:53:12.243612 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.243549 2577 scope.go:117] "RemoveContainer" containerID="779bda5c4d5c73b721bbe192d42cf327df500adc7f9a6bf1d0ac61ce8d8ff8b3" Apr 16 18:53:12.243612 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.243575 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw" Apr 16 18:53:12.244225 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.244193 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:53:12.258443 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.258401 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podStartSLOduration=6.258389026 podStartE2EDuration="6.258389026s" podCreationTimestamp="2026-04-16 18:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:53:12.256153467 +0000 UTC m=+1354.201584202" watchObservedRunningTime="2026-04-16 18:53:12.258389026 +0000 UTC m=+1354.203819736" Apr 16 18:53:12.278929 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.278903 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"] Apr 16 18:53:12.282395 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.282362 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-7ec1f4-predictor-6584dc5565-pvggw"] Apr 16 18:53:12.618121 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:12.618081 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" path="/var/lib/kubelet/pods/b75b79a9-b080-43fd-96ca-1452dc1b5bd1/volumes" Apr 16 18:53:13.248042 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:13.248009 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:53:23.248988 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:23.248943 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:53:33.248886 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:33.248845 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:53:43.248584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:43.248541 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:53:53.248160 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:53:53.248118 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:54:03.248355 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:03.248315 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:54:13.248120 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:13.248077 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:54:23.250194 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:23.250113 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:54:26.230304 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.230266 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"] Apr 16 18:54:26.230760 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.230634 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" containerID="cri-o://554f77f25e31ef46167950e7ac0d245ff3f87ffb759537095a52507cb9232caa" gracePeriod=30 Apr 16 18:54:26.346924 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.346888 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:54:26.347305 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347285 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.347305 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347304 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.347464 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347323 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.347464 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347329 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.347464 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347392 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.347561 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.347522 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b75b79a9-b080-43fd-96ca-1452dc1b5bd1" containerName="storage-initializer" Apr 16 18:54:26.350726 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.350704 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:26.359978 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.359949 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:54:26.409931 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.409900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location\") pod \"raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8\" (UID: \"b4a471bc-ffc4-457c-a68f-fc06acb635be\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:26.511460 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.511371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location\") pod \"raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8\" (UID: \"b4a471bc-ffc4-457c-a68f-fc06acb635be\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:26.511817 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.511797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location\") pod \"raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8\" (UID: \"b4a471bc-ffc4-457c-a68f-fc06acb635be\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:26.661677 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.661632 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:26.783004 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.782978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:54:26.784951 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:54:26.784924 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a471bc_ffc4_457c_a68f_fc06acb635be.slice/crio-4a1cd088de1ceb6289770ef79f5d8e686b9cc55985da8dc2e6dcef20bc9e3994 WatchSource:0}: Error finding container 4a1cd088de1ceb6289770ef79f5d8e686b9cc55985da8dc2e6dcef20bc9e3994: Status 404 returned error can't find the container with id 4a1cd088de1ceb6289770ef79f5d8e686b9cc55985da8dc2e6dcef20bc9e3994 Apr 16 18:54:26.786844 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:26.786826 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:54:27.503455 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:27.503420 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerStarted","Data":"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15"} Apr 16 18:54:27.503455 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:27.503453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerStarted","Data":"4a1cd088de1ceb6289770ef79f5d8e686b9cc55985da8dc2e6dcef20bc9e3994"} Apr 16 18:54:30.516026 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.515989 2577 generic.go:358] "Generic (PLEG): container finished" podID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerID="554f77f25e31ef46167950e7ac0d245ff3f87ffb759537095a52507cb9232caa" exitCode=0 Apr 16 18:54:30.516373 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.516055 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerDied","Data":"554f77f25e31ef46167950e7ac0d245ff3f87ffb759537095a52507cb9232caa"} Apr 16 18:54:30.591034 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.591009 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:54:30.649129 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.649100 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location\") pod \"7dfe33f4-89ad-46f9-9a1b-77218902c940\" (UID: \"7dfe33f4-89ad-46f9-9a1b-77218902c940\") " Apr 16 18:54:30.649428 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.649400 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7dfe33f4-89ad-46f9-9a1b-77218902c940" (UID: "7dfe33f4-89ad-46f9-9a1b-77218902c940"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:54:30.750623 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:30.750587 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dfe33f4-89ad-46f9-9a1b-77218902c940-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:54:31.526645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.526601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" event={"ID":"7dfe33f4-89ad-46f9-9a1b-77218902c940","Type":"ContainerDied","Data":"f95b3a93e95475017f18e291c732586499452fc50b5f6dd13c3b2fbc072b52e3"} Apr 16 18:54:31.526645 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.526624 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x" Apr 16 18:54:31.527160 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.526655 2577 scope.go:117] "RemoveContainer" containerID="554f77f25e31ef46167950e7ac0d245ff3f87ffb759537095a52507cb9232caa" Apr 16 18:54:31.528122 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.528101 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerID="28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15" exitCode=0 Apr 16 18:54:31.528208 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.528191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerDied","Data":"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15"} Apr 16 18:54:31.535584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.535181 2577 scope.go:117] "RemoveContainer" containerID="78751763408c3ce537c8065445eacf4e552e7c7b6f2823ed631b6fb6ad5e3997" Apr 16 18:54:31.557281 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.557253 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"] Apr 16 18:54:31.558912 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:31.558891 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-dd577-predictor-7758bff698-bxx9x"] Apr 16 18:54:32.534578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:32.534543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerStarted","Data":"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f"} Apr 16 18:54:32.535012 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:32.534879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:54:32.536274 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:32.536248 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:54:32.549578 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:32.549537 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podStartSLOduration=6.549523917 podStartE2EDuration="6.549523917s" podCreationTimestamp="2026-04-16 18:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:54:32.548797929 +0000 UTC m=+1434.494228640" watchObservedRunningTime="2026-04-16 18:54:32.549523917 +0000 UTC m=+1434.494954626" Apr 16 18:54:32.620071 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:32.620036 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" path="/var/lib/kubelet/pods/7dfe33f4-89ad-46f9-9a1b-77218902c940/volumes" Apr 16 18:54:33.538648 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:33.538565 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:54:43.538931 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:43.538887 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:54:53.538859 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:54:53.538815 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:55:03.538841 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:03.538762 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:55:13.539103 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:13.539059 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:55:23.539327 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:23.539279 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:55:33.538914 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:33.538864 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 18:55:38.614552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:38.614517 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:55:38.617257 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:38.617232 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:55:38.618860 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:38.618841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:55:38.621215 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:38.621196 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:55:43.539686 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:43.539651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:55:46.423278 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:46.423235 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:55:46.423780 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:46.423707 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" containerID="cri-o://207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f" gracePeriod=30 Apr 16 18:55:50.569535 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.569513 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:55:50.747842 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.747748 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location\") pod \"b4a471bc-ffc4-457c-a68f-fc06acb635be\" (UID: \"b4a471bc-ffc4-457c-a68f-fc06acb635be\") " Apr 16 18:55:50.748039 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.748017 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4a471bc-ffc4-457c-a68f-fc06acb635be" (UID: "b4a471bc-ffc4-457c-a68f-fc06acb635be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:55:50.796733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.796707 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerID="207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f" exitCode=0 Apr 16 18:55:50.796871 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.796794 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" Apr 16 18:55:50.796871 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.796795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerDied","Data":"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f"} Apr 16 18:55:50.796950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.796893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8" event={"ID":"b4a471bc-ffc4-457c-a68f-fc06acb635be","Type":"ContainerDied","Data":"4a1cd088de1ceb6289770ef79f5d8e686b9cc55985da8dc2e6dcef20bc9e3994"} Apr 16 18:55:50.796950 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.796910 2577 scope.go:117] "RemoveContainer" containerID="207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f" Apr 16 18:55:50.805229 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.805209 2577 scope.go:117] "RemoveContainer" containerID="28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15" Apr 16 18:55:50.812445 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.812432 2577 scope.go:117] "RemoveContainer" containerID="207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f" Apr 16 18:55:50.812689 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:55:50.812671 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f\": container with ID starting with 207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f not found: ID does not exist" containerID="207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f" Apr 16 18:55:50.812750 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.812696 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f"} err="failed to get container status \"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f\": rpc error: code = NotFound desc = could not find container \"207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f\": container with ID starting with 207b9ae4a59d499afbfa28bfec97f3a31aee860bbcd8520a370fc846888ac99f not found: ID does not exist" Apr 16 18:55:50.812750 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.812714 2577 scope.go:117] "RemoveContainer" containerID="28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15" Apr 16 18:55:50.812953 ip-10-0-140-1 kubenswrapper[2577]: E0416 18:55:50.812935 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15\": container with ID starting with 28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15 not found: ID does not exist" containerID="28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15" Apr 16 18:55:50.812993 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.812960 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15"} err="failed to get container status \"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15\": rpc error: code = 
NotFound desc = could not find container \"28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15\": container with ID starting with 28d519da88a9bf4b4ce1135fef8ae1dda96789f6830a370d6686b123bff56a15 not found: ID does not exist" Apr 16 18:55:50.816962 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.816940 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:55:50.821195 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.821176 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cef54-predictor-9df4fddf-kgpx8"] Apr 16 18:55:50.848398 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:50.848374 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4a471bc-ffc4-457c-a68f-fc06acb635be-kserve-provision-location\") on node \"ip-10-0-140-1.ec2.internal\" DevicePath \"\"" Apr 16 18:55:52.617906 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:55:52.617826 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" path="/var/lib/kubelet/pods/b4a471bc-ffc4-457c-a68f-fc06acb635be/volumes" Apr 16 18:56:13.585174 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:13.585150 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hzbsx_ed188e37-3c6c-4aa4-9451-3d99128b9dec/global-pull-secret-syncer/0.log" Apr 16 18:56:13.703599 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:13.703572 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mw7zk_85759909-428a-4c11-95e0-96f51d6580f6/konnectivity-agent/0.log" Apr 16 18:56:13.748727 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:13.748699 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-1.ec2.internal_d652678662059d52536902d6dffe6ef4/haproxy/0.log" Apr 16 18:56:16.832218 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:16.832189 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/alertmanager/0.log" Apr 16 18:56:16.874636 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:16.874610 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/config-reloader/0.log" Apr 16 18:56:16.914776 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:16.914748 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/kube-rbac-proxy-web/0.log" Apr 16 18:56:16.954032 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:16.954001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/kube-rbac-proxy/0.log" Apr 16 18:56:17.000263 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.000238 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/kube-rbac-proxy-metric/0.log" Apr 16 18:56:17.049106 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.049082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/prom-label-proxy/0.log" Apr 16 18:56:17.107033 ip-10-0-140-1 kubenswrapper[2577]: 
I0416 18:56:17.106960 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b68e9f68-7281-4657-935d-4795df995d7d/init-config-reloader/0.log" Apr 16 18:56:17.297486 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.297455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5466b5fd47-bm2b5_dcc3d5f1-1012-45bf-9057-4522e50a3623/metrics-server/0.log" Apr 16 18:56:17.466972 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.466900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v66t_6ebc2f20-5e6d-4dba-aa77-26f703020588/node-exporter/0.log" Apr 16 18:56:17.490104 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.490084 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v66t_6ebc2f20-5e6d-4dba-aa77-26f703020588/kube-rbac-proxy/0.log" Apr 16 18:56:17.518361 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:17.518340 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6v66t_6ebc2f20-5e6d-4dba-aa77-26f703020588/init-textfile/0.log" Apr 16 18:56:18.065133 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.065105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/thanos-query/0.log" Apr 16 18:56:18.101646 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.101622 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/kube-rbac-proxy-web/0.log" Apr 16 18:56:18.133526 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.133503 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/kube-rbac-proxy/0.log" Apr 16 18:56:18.165493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.165464 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/prom-label-proxy/0.log" Apr 16 18:56:18.193023 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.193003 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/kube-rbac-proxy-rules/0.log" Apr 16 18:56:18.220013 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:18.219993 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6457777b56-twkd8_c63c44d1-e7f6-4c84-a570-20f255e769fd/kube-rbac-proxy-metrics/0.log" Apr 16 18:56:19.622169 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:19.622144 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/2.log" Apr 16 18:56:19.626387 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:19.626367 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-8lld6_6d3a1dd4-8bf2-4a6f-8e0f-ee72ea205897/console-operator/3.log" Apr 16 18:56:19.963174 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:19.963101 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c9844876b-8xmmx_b34c8247-0103-4976-8268-fedfabee9ba7/console/0.log" Apr 16 18:56:19.989546 ip-10-0-140-1 
kubenswrapper[2577]: I0416 18:56:19.989519 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-fj9jn_7a21faa0-d8d7-438e-a753-354f55344b61/download-server/0.log" Apr 16 18:56:20.338136 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338106 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"] Apr 16 18:56:20.338446 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338435 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="storage-initializer" Apr 16 18:56:20.338493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338448 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="storage-initializer" Apr 16 18:56:20.338493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338456 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" Apr 16 18:56:20.338493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338462 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" Apr 16 18:56:20.338493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338474 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" Apr 16 18:56:20.338493 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338482 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" Apr 16 18:56:20.338642 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338500 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="storage-initializer" Apr 16 18:56:20.338642 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338506 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="storage-initializer" Apr 16 18:56:20.338642 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338554 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a471bc-ffc4-457c-a68f-fc06acb635be" containerName="kserve-container" Apr 16 18:56:20.338642 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.338565 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dfe33f4-89ad-46f9-9a1b-77218902c940" containerName="kserve-container" Apr 16 18:56:20.341534 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.341518 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.343685 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.343660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h2sf7\"/\"kube-root-ca.crt\"" Apr 16 18:56:20.343860 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.343709 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h2sf7\"/\"openshift-service-ca.crt\"" Apr 16 18:56:20.343860 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.343826 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h2sf7\"/\"default-dockercfg-vvgvg\"" Apr 16 18:56:20.349824 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.349801 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"] Apr 16 18:56:20.380567 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.380542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-wqxdz_aae33711-4f13-4a51-afba-c1684da3b750/volume-data-source-validator/0.log" Apr 16 18:56:20.408177 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.408155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-proc\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.408268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.408184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-podres\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.408268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.408225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbd8\" (UniqueName: \"kubernetes.io/projected/1eb3224f-37ca-469d-932f-ad381686d8ec-kube-api-access-xbbd8\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.408268 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.408253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-lib-modules\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.408376 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.408301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-sys\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.508660 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508631 2577 
Apr 16 18:56:20.508660 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-sys\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508798 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-proc\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508798 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-podres\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508798 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-sys\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508798 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-proc\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508933 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508787 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbd8\" (UniqueName: \"kubernetes.io/projected/1eb3224f-37ca-469d-932f-ad381686d8ec-kube-api-access-xbbd8\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508933 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-lib-modules\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.508933 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-podres\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:20.509030 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.508984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1eb3224f-37ca-469d-932f-ad381686d8ec-lib-modules\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
\"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.516080 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.516055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbd8\" (UniqueName: \"kubernetes.io/projected/1eb3224f-37ca-469d-932f-ad381686d8ec-kube-api-access-xbbd8\") pod \"perf-node-gather-daemonset-ndf22\" (UID: \"1eb3224f-37ca-469d-932f-ad381686d8ec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.651797 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.651682 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" Apr 16 18:56:20.975901 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:20.975876 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"] Apr 16 18:56:20.978345 ip-10-0-140-1 kubenswrapper[2577]: W0416 18:56:20.978315 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1eb3224f_37ca_469d_932f_ad381686d8ec.slice/crio-8483dffa6134efa60f28293962b12a0b8a2e972c20dc0d4b1524232967276704 WatchSource:0}: Error finding container 8483dffa6134efa60f28293962b12a0b8a2e972c20dc0d4b1524232967276704: Status 404 returned error can't find the container with id 8483dffa6134efa60f28293962b12a0b8a2e972c20dc0d4b1524232967276704 Apr 16 18:56:21.025068 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.025048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ldpjc_2d4ed685-8585-4063-a50d-bab899fa550e/dns/0.log" Apr 16 18:56:21.044240 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.044222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ldpjc_2d4ed685-8585-4063-a50d-bab899fa550e/kube-rbac-proxy/0.log" Apr 16 18:56:21.107629 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.107603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7jl4x_b86bb118-f0ab-4605-860a-df81a23f9124/dns-node-resolver/0.log" Apr 16 18:56:21.519552 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.519522 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-84b8f69b7d-8pb5p_689117e1-30ad-4535-910e-895627fda928/registry/0.log" Apr 16 18:56:21.525249 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.525225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-84b8f69b7d-8pb5p_689117e1-30ad-4535-910e-895627fda928/registry/1.log" Apr 16 18:56:21.570971 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.570952 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7rdcc_8f66e95f-32ea-4c62-b967-18110b01aac3/node-ca/0.log" Apr 16 18:56:21.906039 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.906000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" event={"ID":"1eb3224f-37ca-469d-932f-ad381686d8ec","Type":"ContainerStarted","Data":"8660082f5b46a6bba1d50c542466667467e3214a7886f29defd6421ce6a03773"} Apr 16 18:56:21.906039 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.906041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" 
Apr 16 18:56:21.906039 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.906041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" event={"ID":"1eb3224f-37ca-469d-932f-ad381686d8ec","Type":"ContainerStarted","Data":"8483dffa6134efa60f28293962b12a0b8a2e972c20dc0d4b1524232967276704"}
Apr 16 18:56:21.906468 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.906096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
Apr 16 18:56:21.923631 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:21.923586 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22" podStartSLOduration=1.923574114 podStartE2EDuration="1.923574114s" podCreationTimestamp="2026-04-16 18:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:21.922463884 +0000 UTC m=+1543.867894605" watchObservedRunningTime="2026-04-16 18:56:21.923574114 +0000 UTC m=+1543.869004824"
Apr 16 18:56:22.257417 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:22.257326 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5646c9b8dd-rn92g_d2936037-2326-466f-9946-5ddb752141d0/router/0.log"
Apr 16 18:56:22.594407 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:22.594370 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mstsd_5276ac45-8e09-409e-989a-d2ebdd40a1a4/serve-healthcheck-canary/0.log"
Apr 16 18:56:22.956185 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:22.956105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fq9mb_48eda739-7e21-4258-b417-fc943a77343a/insights-operator/0.log"
Apr 16 18:56:22.956534 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:22.956304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-fq9mb_48eda739-7e21-4258-b417-fc943a77343a/insights-operator/1.log"
Apr 16 18:56:23.041280 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:23.041257 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s2zk_c6f20480-57b7-41d9-b9fb-06c81c82803c/kube-rbac-proxy/0.log"
Apr 16 18:56:23.060386 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:23.060364 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s2zk_c6f20480-57b7-41d9-b9fb-06c81c82803c/exporter/0.log"
Apr 16 18:56:23.080075 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:23.080054 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9s2zk_c6f20480-57b7-41d9-b9fb-06c81c82803c/extractor/0.log"
Apr 16 18:56:25.043730 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:25.043701 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-74rb9_67781650-8927-4cc0-9570-d23f75a9eed9/manager/0.log"
Apr 16 18:56:25.066619 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:25.066597 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-twwgv_16312cc7-ba8e-496d-99ce-c12255f40602/server/0.log"
Apr 16 18:56:27.918694 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:27.918664 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-ndf22"
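[Annotation] The pod_startup_latency_tracker entry reports podStartSLOduration as the gap between podCreationTimestamp (18:56:20) and watchObservedRunningTime (18:56:21.923574114), i.e. about 1.92s; because no image pull was needed, firstStartedPulling/lastFinishedPulling are zero values. A small Go check of that arithmetic, using timestamps copied from the entry:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps taken from the log entry above.
	created, _ := time.Parse(time.RFC3339Nano, "2026-04-16T18:56:20Z")
	running, _ := time.Parse(time.RFC3339Nano, "2026-04-16T18:56:21.923574114Z")
	// podStartSLOduration = watchObservedRunningTime - podCreationTimestamp
	fmt.Println(running.Sub(created)) // prints 1.923574114s
}
```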
Apr 16 18:56:29.208190 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:29.208157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-2m45b_6514a34b-67e6-4daf-a518-91f2a7316066/kube-storage-version-migrator-operator/1.log"
Apr 16 18:56:29.209125 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:29.209106 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-2m45b_6514a34b-67e6-4daf-a518-91f2a7316066/kube-storage-version-migrator-operator/0.log"
Apr 16 18:56:30.311206 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.311124 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9fkg9_c4d0bf99-a3c2-47cc-acdf-4ffec50d8ba3/kube-multus/0.log"
Apr 16 18:56:30.337641 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.337607 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:56:30.358365 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.358333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/egress-router-binary-copy/0.log"
Apr 16 18:56:30.378348 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.378320 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/cni-plugins/0.log"
Apr 16 18:56:30.399357 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.399331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/bond-cni-plugin/0.log"
Apr 16 18:56:30.419191 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.419168 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/routeoverride-cni/0.log"
Apr 16 18:56:30.440584 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.440562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/whereabouts-cni-bincopy/0.log"
Apr 16 18:56:30.460670 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.460640 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-49rzx_f7ecb9d0-5eb7-46c9-b65f-725014636854/whereabouts-cni/0.log"
Apr 16 18:56:30.962735 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.962703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tldk9_70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1/network-metrics-daemon/0.log"
Apr 16 18:56:30.985824 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:30.985801 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tldk9_70f7edcb-5c76-4d9d-b5ed-8a4093c9dda1/kube-rbac-proxy/0.log"
Apr 16 18:56:32.065874 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.065787 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-controller/0.log"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/0.log" Apr 16 18:56:32.109207 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.109188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovn-acl-logging/1.log" Apr 16 18:56:32.139098 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.139074 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/kube-rbac-proxy-node/0.log" Apr 16 18:56:32.181707 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.181683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:56:32.221733 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.221713 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/northd/0.log" Apr 16 18:56:32.259555 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.259530 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/nbdb/0.log" Apr 16 18:56:32.297367 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.297340 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/sbdb/0.log" Apr 16 18:56:32.390301 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:32.390271 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dsp8f_b18d081b-3d3f-48e8-8f52-8eb619b60b77/ovnkube-controller/0.log" Apr 16 18:56:33.939255 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:33.939228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-57qhk_47c264de-a221-4aa7-8732-5a2e31ec7974/network-check-target-container/0.log" Apr 16 18:56:34.908703 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:34.908677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mmhj6_6194e8f3-e97c-49da-8ebb-4764a9a77850/iptables-alerter/0.log" Apr 16 18:56:35.555410 ip-10-0-140-1 kubenswrapper[2577]: I0416 18:56:35.555385 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lgzq4_6e091b04-e5d1-4928-9203-5358e7ad1e2a/tuned/0.log"