Apr 17 07:49:44.169539 ip-10-0-138-233 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:49:44.169553 ip-10-0-138-233 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:49:44.169564 ip-10-0-138-233 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:49:44.169875 ip-10-0-138-233 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 07:49:54.177819 ip-10-0-138-233 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 07:49:54.177839 ip-10-0-138-233 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 36620611445c4247a8359e67fc110aaf --
Apr 17 07:52:20.062611 ip-10-0-138-233 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 07:52:20.545645 ip-10-0-138-233 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:20.545645 ip-10-0-138-233 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 07:52:20.545645 ip-10-0-138-233 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 07:52:20.545645 ip-10-0-138-233 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 07:52:20.545645 ip-10-0-138-233 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
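The five deprecation warnings above all point at the same fix: carry these settings in the file named by --config (per the FLAG dump below, /etc/kubernetes/kubelet.conf on this node) rather than on the command line. As a minimal sketch, assuming the k8s.io/kubelet/config/v1beta1 and sigs.k8s.io/yaml modules are available, the equivalent KubeletConfiguration could be rendered like this; the field values are copied from this log's own FLAG dump, not from the node's real config file:

package main

import (
	"fmt"

	kubeletconfigv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	// Values copied from the FLAG dump later in this log.
	cfg := kubeletconfigv1beta1.KubeletConfiguration{
		ContainerRuntimeEndpoint: "/var/run/crio/crio.sock",
		VolumePluginDir:          "/etc/kubernetes/kubelet-plugins/volume/exec",
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"ephemeral-storage": "1Gi",
			"memory":            "1Gi",
		},
		// --minimum-container-ttl-duration has no config-file field;
		// its deprecation notice says to use evictionHard/evictionSoft instead.
	}
	cfg.APIVersion = "kubelet.config.k8s.io/v1beta1"
	cfg.Kind = "KubeletConfiguration"

	out, err := yaml.Marshal(cfg)
	if err != nil {
		panic(err)
	}
	// Prints YAML that could seed a kubelet config file
	// (zero-valued fields may also be emitted).
	fmt.Print(string(out))
}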
Apr 17 07:52:20.548191 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.548100 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 07:52:20.552109 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552094 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:20.552109 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552109 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552113 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552117 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552120 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552123 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552126 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552129 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552134 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552137 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552140 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552143 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552146 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552148 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552151 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552154 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552156 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552160 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552164 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552167 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552170 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:20.552170 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552173 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552176 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552178 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552182 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552185 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552188 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552191 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552193 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552196 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552199 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552201 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552204 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552206 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552209 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552212 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552215 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552217 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552220 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552222 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552225 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:20.552645 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552228 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552231 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552233 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552239 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552242 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552245 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552248 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552251 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552253 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552256 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552258 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552261 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552263 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552266 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552269 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552272 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552275 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552278 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552280 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:20.553128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552283 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552286 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552288 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552291 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552293 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552296 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552299 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552302 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552305 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552308 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552323 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552326 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552329 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552331 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552353 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552373 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552377 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552380 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552387 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552390 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:20.553657 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552394 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552397 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552399 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552402 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552405 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.552408 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.553979 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.553992 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.553997 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554010 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554015 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554020 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554024 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554029 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554034 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554038 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554042 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554046 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554050 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:20.554424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554054 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554059 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554069 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554073 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554077 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554081 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554086 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554090 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554094 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554098 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554102 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554106 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554111 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554115 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554120 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554130 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554135 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554139 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554144 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554148 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:20.554918 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554152 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554156 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554161 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554165 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554169 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554173 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554177 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554181 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554190 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554194 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554199 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554203 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554207 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554211 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554215 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554220 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554224 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554228 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554233 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554240 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:20.555448 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554251 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554256 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554263 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554268 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554272 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554277 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554282 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554287 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554291 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554295 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554300 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554304 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554333 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554338 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554342 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554346 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554350 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554354 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554359 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554362 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:20.555932 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554367 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554371 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554376 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554382 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554392 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554396 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554460 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554630 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554637 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554646 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554649 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554652 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.554656 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554731 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554739 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554747 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554752 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554756 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554760 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554767 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 07:52:20.556424 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554772 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554775 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554778 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554782 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554785 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554789 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554791 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554794 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554797 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554800 2570 flags.go:64] FLAG: --cloud-config=""
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554803 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554806 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554811 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554814 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554818 2570 flags.go:64] FLAG: --config-dir=""
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554820 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554824 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554828 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554832 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554835 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554839 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554842 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554845 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554848 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554851 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 07:52:20.556905 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554854 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554859 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554862 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554865 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554868 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554871 2570 flags.go:64] FLAG: --enable-server="true"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554874 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554879 2570 flags.go:64] FLAG: --event-burst="100"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554882 2570 flags.go:64] FLAG: --event-qps="50"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554885 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554888 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554891 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554894 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554897 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554900 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554903 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554906 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554909 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554912 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554915 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554919 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554922 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554925 2570 flags.go:64] FLAG: --feature-gates=""
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554929 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554933 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 07:52:20.557526 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554936 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554940 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554944 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554947 2570 flags.go:64] FLAG: --help="false"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554950 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554953 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554956 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554959 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554963 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554966 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554969 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554972 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554975 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554978 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554981 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554985 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554987 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554990 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554993 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554996 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.554999 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555001 2570 flags.go:64] FLAG: --lock-file=""
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555005 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555008 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 07:52:20.558112 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555010 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555016 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555019 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555022 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555025 2570 flags.go:64] FLAG: --logging-format="text"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555028 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555032 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555035 2570 flags.go:64] FLAG: --manifest-url=""
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555038 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555043 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555048 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555052 2570 flags.go:64] FLAG: --max-pods="110"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555055 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555058 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555061 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555064 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555067 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555070 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555073 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555081 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555084 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555087 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555090 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 17 07:52:20.558725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555093 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555098 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555101 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555104 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555107 2570 flags.go:64] FLAG: --port="10250"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555110 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555113 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e6720cd6fd369343"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555116 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555119 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555123 2570 flags.go:64] FLAG: --register-node="true"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555125 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555128 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555132 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555135 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555137 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555141 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555145 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555148 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555151 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555155 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555158 2570 flags.go:64] FLAG: --runonce="false"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555161 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555164 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555167 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555170 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555173 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 07:52:20.559263 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555176 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555179 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555183 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555185 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555188 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555191 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555194 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555196 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555199 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555202 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555209 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555212 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555215 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555219 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555222 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555225 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555228 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555231 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555234 2570 flags.go:64] FLAG: --v="2"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555239 2570 flags.go:64] FLAG: --version="false"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555243 2570 flags.go:64] FLAG: --vmodule=""
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555248 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.555251 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555359 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:20.559920 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555364 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555367 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555370 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555373 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555376 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555379 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555382 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555384 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555387 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555391 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555395 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555398 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555401 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555404 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555406 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555409 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555412 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555414 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555418 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:20.560503 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555420 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555424 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555426 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555429 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555432 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555435 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555438 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555440 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555443 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555445 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555448 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555451 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555454 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555457 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555459 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555462 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555465 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555469 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555473 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555476 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:20.560977 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555479 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555482 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555485 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555487 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555490 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555492 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555495 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555497 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555500 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555503 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555505 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555508 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555511 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555513 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555516 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555519 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555521 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555523 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555526 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555529 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:20.561513 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555531 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555534 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555536 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555539 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555541 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555544 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555547 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555549 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555552 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555554 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555557 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555560 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555565 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555568 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555570 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555573 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555575 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555578 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555580 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555583 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:20.562004 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555585 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555588 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555590 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555596 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555598 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.555601 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:20.562501 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.556244 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:20.562660 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.562627 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 07:52:20.562660 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.562642 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562685 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562689 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562693 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562696 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562698 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562701 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562704 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562707 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562709 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:20.562715 ip-10-0-138-233
kubenswrapper[2570]: W0417 07:52:20.562712 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562715 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562718 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 07:52:20.562715 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562721 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562725 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562728 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562731 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562734 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562736 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562739 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562741 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562744 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562747 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562749 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562752 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562754 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562757 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562760 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562763 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562765 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562768 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562770 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: 
W0417 07:52:20.562773 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 07:52:20.563033 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562775 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562778 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562781 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562783 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562786 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562788 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562790 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562793 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562795 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562798 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562800 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562803 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562805 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562808 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562812 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562815 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562817 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562821 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562825 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562828 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:20.563558 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562832 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562834 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562837 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562840 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562843 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562846 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562849 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562851 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562854 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562856 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562859 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562862 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562865 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562869 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562872 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562874 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562877 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562879 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562882 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:20.564041 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562884 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562887 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562889 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562892 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562894 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562897 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562899 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562902 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562906 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562909 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562911 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562914 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562917 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562919 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.562922 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.562927 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:20.564504 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563021 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563025 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563029 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563032 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563035 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563038 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563041 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563044 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563047 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563049 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563052 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563055 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563057 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563060 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563063 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563065 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563068 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563070 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563073 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563075 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 07:52:20.564898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563078 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563080 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563083 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563085 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563088 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563091 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563095 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563099 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563101 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563104 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563107 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563110 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563112 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563115 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563118 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563121 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563123 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563126 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563128 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 07:52:20.565388 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563131 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563133 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563136 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563138 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563141 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563144 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563146 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563149 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563151 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563154 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563156 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563158 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563161 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563163 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563166 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563168 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563171 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563173 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563176 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563179 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 07:52:20.565828 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563182 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563184 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563186 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563189 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563192 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563195 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563197 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563200 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563202 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563205 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563207 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563210 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563212 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563215 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563217 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563219 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563222 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563225 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563227 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563230 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 07:52:20.566479 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563234 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563237 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563239 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563242 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563244 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563247 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:20.563249 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.563255 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.563993 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 07:52:20.567008 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.566122 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 07:52:20.567306 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.567077 2570 server.go:1019] "Starting client certificate rotation"
Apr 17 07:52:20.567306 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.567185 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:52:20.567306 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.567232 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 07:52:20.595147 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.595124 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:52:20.598356 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.598331 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 07:52:20.618864 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.618843 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 17 07:52:20.621963 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.621943 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 07:52:20.624787 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.624774 2570 log.go:25] "Validated CRI v1 image API"
Apr 17 07:52:20.626647 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.626632 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 07:52:20.632025 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.632000 2570 fs.go:135] Filesystem UUIDs: map[32d1eaf4-b8bc-4c59-bd9f-1c593771c420:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 dfcee744-b33f-470f-8f1c-f6097abbd990:/dev/nvme0n1p4]
Apr 17 07:52:20.632098 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.632026 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 07:52:20.638364 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.638344 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-22kg9"
Apr 17 07:52:20.639109 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.638990 2570 manager.go:217] Machine: {Timestamp:2026-04-17 07:52:20.6367522 +0000 UTC m=+0.445762921 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3077426 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2586c7c046553cedefdc85772ddb4b SystemUUID:ec2586c7-c046-553c-edef-dc85772ddb4b BootID:36620611-445c-4247-a835-9e67fc110aaf Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cc:e2:11:3b:9f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cc:e2:11:3b:9f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:db:bd:a4:00:ca Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 07:52:20.639109 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.639104 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 07:52:20.639258 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.639220 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 07:52:20.640473 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.640450 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 07:52:20.640663 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.640476 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-233.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 07:52:20.640746 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.640678 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 07:52:20.640746 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.640689 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 07:52:20.640746 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.640715 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:52:20.642666 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.642653 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 07:52:20.644120 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.644107 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 07:52:20.644253 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.644242 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 07:52:20.644997 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.644982 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-22kg9"
Apr 17 07:52:20.646846 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.646834 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 07:52:20.646883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.646851 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 07:52:20.646883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.646866 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 07:52:20.646883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.646875 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 17 07:52:20.646883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.646883 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 07:52:20.647990 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.647976 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 07:52:20.648037 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.648004 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 07:52:20.651195 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.651175 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 07:52:20.652522 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.652509 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 07:52:20.654453 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654438 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654457 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654466 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654472 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654477 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654486 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654495 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654500 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654508 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654514 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654524 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 07:52:20.654529 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.654533 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 07:52:20.655412 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.655401 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 07:52:20.655412 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.655412 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 07:52:20.657977 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.657961 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:20.659224 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659210 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 07:52:20.659271 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659256 2570 server.go:1295] "Started kubelet"
Apr 17 07:52:20.659398 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659350 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 07:52:20.659485 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659437 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 07:52:20.659539 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659516 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 07:52:20.659756 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.659743 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:20.660199 ip-10-0-138-233 systemd[1]: Started Kubernetes Kubelet.
Apr 17 07:52:20.661308 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.661192 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 07:52:20.661717 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.661701 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 07:52:20.663303 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.663288 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-233.ec2.internal" not found
Apr 17 07:52:20.668190 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.668159 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 07:52:20.668662 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.668641 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 07:52:20.668764 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.668750 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 07:52:20.669407 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669390 2570 factory.go:55] Registering systemd factory
Apr 17 07:52:20.669479 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669411 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 17 07:52:20.669763 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.669744 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-233.ec2.internal\" not found"
Apr 17 07:52:20.669812 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669782 2570 factory.go:153] Registering CRI-O factory
Apr 17 07:52:20.669812 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669793 2570 factory.go:223] Registration of the crio container factory successfully
Apr 17 07:52:20.669883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669828 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 07:52:20.669883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669831 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 07:52:20.669883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669848 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 07:52:20.669883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669856 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 07:52:20.670055 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669900 2570 factory.go:103] Registering Raw factory
Apr 17 07:52:20.670055 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669935 2570 manager.go:1196] Started watching for new ooms in manager
Apr 17 07:52:20.670055 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669938 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 07:52:20.670055 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.669954 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 07:52:20.670392 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.670380 2570 manager.go:319] Starting recovery of all containers
Apr 17 07:52:20.671058 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.671040 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:20.674544 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.674517 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-233.ec2.internal\" not found" node="ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.679001 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.678841 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-233.ec2.internal" not found
Apr 17 07:52:20.680558 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.680543 2570 manager.go:324] Recovery completed
Apr 17 07:52:20.685222 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.685208 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:20.689103 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689088 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:20.689169 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689119 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:20.689169 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689132 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:20.689620 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689607 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 07:52:20.689667 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689620 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 07:52:20.689667 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.689639 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 07:52:20.692180 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.692169 2570 policy_none.go:49] "None policy: Start"
Apr 17 07:52:20.692222 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.692185 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 07:52:20.692222 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.692194 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 07:52:20.735877 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.735860 2570 manager.go:341] "Starting Device Plugin manager"
Apr 17 07:52:20.735982 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.735898 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 07:52:20.735982 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.735909 2570 server.go:85] "Starting device plugin registration server"
Apr 17 07:52:20.735982 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.735939 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-233.ec2.internal" not found
Apr 17 07:52:20.736147 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.736134 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 07:52:20.736195 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.736150 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 07:52:20.736266 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.736247 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 07:52:20.736389 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.736345 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 07:52:20.736389 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.736353 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 07:52:20.736980 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.736961 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 07:52:20.737047 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.737000 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-233.ec2.internal\" not found"
Apr 17 07:52:20.822918 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.822844 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 07:52:20.824152 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.824134 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 07:52:20.824232 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.824163 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 07:52:20.824232 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.824182 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 07:52:20.824232 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.824188 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 07:52:20.824232 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:20.824220 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 07:52:20.826728 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.826712 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 07:52:20.836534 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.836510 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 07:52:20.837471 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.837456 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 07:52:20.837538 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.837492 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 07:52:20.837538 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.837503 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeHasSufficientPID"
Apr 17 07:52:20.837538 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.837530 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.845717 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.845697 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.924468 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.924424 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"]
Apr 17 07:52:20.926777 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.926762 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.926862 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.926762 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.960982 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.960961 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.965441 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.965427 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.971154 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.971140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/169311e9e20ae28d37e18c3438cd6482-config\") pod \"kube-apiserver-proxy-ip-10-0-138-233.ec2.internal\" (UID: \"169311e9e20ae28d37e18c3438cd6482\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.971210 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.971164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.971210 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.971184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:20.974146 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.974132 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:20.978558 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:20.978540 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 07:52:21.072072 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/169311e9e20ae28d37e18c3438cd6482-config\") pod \"kube-apiserver-proxy-ip-10-0-138-233.ec2.internal\" (UID: \"169311e9e20ae28d37e18c3438cd6482\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.072176 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.072176 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.072176 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072141 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.072176 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072146 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/169311e9e20ae28d37e18c3438cd6482-config\") pod \"kube-apiserver-proxy-ip-10-0-138-233.ec2.internal\" (UID: \"169311e9e20ae28d37e18c3438cd6482\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.072299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.072180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1986491dc9fe18f2259782ab504301ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal\" (UID: \"1986491dc9fe18f2259782ab504301ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.276733 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.276702 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal"
Apr 17 07:52:21.282130 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.282110 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal" Apr 17 07:52:21.566964 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.566890 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 07:52:21.567573 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.567041 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:52:21.567573 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.567041 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:52:21.567573 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.567042 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 07:52:21.647195 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.647151 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:47:20 +0000 UTC" deadline="2027-10-03 08:24:39.364225822 +0000 UTC" Apr 17 07:52:21.647195 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.647176 2570 apiserver.go:52] "Watching apiserver" Apr 17 07:52:21.647428 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.647183 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12816h32m17.717045008s" Apr 17 07:52:21.653579 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.653561 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 07:52:21.656010 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.655861 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-dh54x","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal","openshift-multus/multus-5r6l7","openshift-multus/network-metrics-daemon-7bs5q","openshift-network-operator/iptables-alerter-xl7sg","openshift-ovn-kubernetes/ovnkube-node-x6ppg","kube-system/konnectivity-agent-4z55l","kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal","openshift-cluster-node-tuning-operator/tuned-wh74p","openshift-dns/node-resolver-5v2tv","openshift-image-registry/node-ca-4tp7h","openshift-multus/multus-additional-cni-plugins-jgncd"] Apr 17 07:52:21.657379 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.657361 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:21.657467 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.657419 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb" Apr 17 07:52:21.659556 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.659535 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.660597 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.660576 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.660685 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.660652 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:21.660768 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.660727 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:21.661439 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.661419 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bx2wj\"" Apr 17 07:52:21.661614 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.661597 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.661614 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.661608 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.661761 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.661649 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 07:52:21.662296 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662279 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.662611 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662588 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 07:52:21.662692 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662641 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 07:52:21.662743 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662701 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xjh45\"" Apr 17 07:52:21.662923 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662909 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.662979 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.662965 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.663682 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.663666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.664204 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.664184 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.664556 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.664538 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.664648 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.664631 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-m2q46\"" Apr 17 07:52:21.664688 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.664652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 07:52:21.664957 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.664946 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.665446 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.665429 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 07:52:21.665528 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.665503 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 07:52:21.665800 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.665780 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 07:52:21.665888 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.665806 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 07:52:21.665888 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.665813 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.666140 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.666127 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.667389 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.666823 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xdwvs\"" Apr 17 07:52:21.667389 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.666921 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.667680 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.667651 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 07:52:21.668116 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.668096 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d2rqr\"" Apr 17 07:52:21.668308 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.668109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 07:52:21.668697 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.668676 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 07:52:21.668883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.668853 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.669096 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.669075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.669651 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.669631 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.669952 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.669881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ft4ks\"" Apr 17 07:52:21.671237 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.671220 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.671445 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.671430 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.671546 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.671529 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4mb8j\"" Apr 17 07:52:21.671699 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.671681 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.673274 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673249 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.673712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673692 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 07:52:21.673788 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673714 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wqc64\"" Apr 17 07:52:21.673788 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-conf-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.673788 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236582ca-8db3-417e-b2cf-dc0053b8afcf-tmp-dir\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.673788 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673772 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-ovn\") pod 
\"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673817 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdblz\" (UniqueName: \"kubernetes.io/projected/21b5c294-9caa-41ea-8bd0-357a8981ec9b-kube-api-access-cdblz\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysconfig\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-kubernetes\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673870 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-cnibin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-netns\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqvb\" (UniqueName: \"kubernetes.io/projected/c6a74de4-3b80-4092-8de3-2a795216aa48-kube-api-access-lbqvb\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.673978 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-var-lib-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.673985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-host\") pod 
\"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpn5\" (UniqueName: \"kubernetes.io/projected/18767c55-78c8-48a0-ae7f-c2c09ebac544-kube-api-access-gwpn5\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674112 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2sqc\" (UniqueName: \"kubernetes.io/projected/236582ca-8db3-417e-b2cf-dc0053b8afcf-kube-api-access-r2sqc\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-config\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-socket-dir-parent\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674215 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-bin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 
07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674243 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8jk\" (UniqueName: \"kubernetes.io/projected/6eb6689c-0d36-4771-bcac-8118455cada4-kube-api-access-fb8jk\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-env-overrides\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674289 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovn-node-metrics-cert\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-sys-fs\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674337 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-multus\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-multus-daemon-config\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18767c55-78c8-48a0-ae7f-c2c09ebac544-host-slash\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236582ca-8db3-417e-b2cf-dc0053b8afcf-hosts-file\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-systemd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e79f54-de56-4c95-814b-f41296cabe3d-konnectivity-ca\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-systemd\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674460 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-os-release\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-k8s-cni-cncf-io\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-script-lib\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-kubelet\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6a74de4-3b80-4092-8de3-2a795216aa48-serviceca\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-etc-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-etc-selinux\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674604 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-system-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674630 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18767c55-78c8-48a0-ae7f-c2c09ebac544-iptables-alerter-script\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674657 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-slash\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-netns\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-modprobe-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.674894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-etc-tuned\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpnrx\" (UniqueName: \"kubernetes.io/projected/02603763-8df0-4492-8876-283310358655-kube-api-access-lpnrx\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674765 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6a74de4-3b80-4092-8de3-2a795216aa48-host\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674784 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-bin\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-cni-binary-copy\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-multus-certs\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-systemd-units\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-netd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-device-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-sys\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674924 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-socket-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: 
\"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674944 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-etc-kubernetes\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674981 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-node-log\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.674991 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675000 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e79f54-de56-4c95-814b-f41296cabe3d-agent-certs\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 07:52:21.675545 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675088 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-lib-modules\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675150 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vlb\" (UniqueName: \"kubernetes.io/projected/cadd455a-a423-45c3-9621-616ec07f359b-kube-api-access-k4vlb\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: 
I0417 07:52:21.675167 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-kubelet\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-conf\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-log-socket\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675240 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-var-lib-kubelet\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-registration-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675289 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sj2h5\"" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-hostroot\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " 
pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g669x\" (UniqueName: \"kubernetes.io/projected/b124ed47-021a-4bde-8c03-dcfce0f301d8-kube-api-access-g669x\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-run\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.676020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.675424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-tmp\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.679825 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.679807 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 07:52:21.699060 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.699040 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jx6hj" Apr 17 07:52:21.706440 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.706423 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jx6hj" Apr 17 07:52:21.770436 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.770409 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 07:52:21.775802 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-kubelet\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-conf\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775829 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-kubelet\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-log-socket\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-var-lib-kubelet\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.775902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-registration-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-hostroot\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775952 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-log-socket\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g669x\" (UniqueName: \"kubernetes.io/projected/b124ed47-021a-4bde-8c03-dcfce0f301d8-kube-api-access-g669x\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 
07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-var-lib-kubelet\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.775959 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-conf\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-hostroot\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.776079 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-registration-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-run\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776194 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776167 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-run\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.776210 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.276135932 +0000 UTC m=+2.085146659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-tmp\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-conf-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776330 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236582ca-8db3-417e-b2cf-dc0053b8afcf-tmp-dir\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-ovn\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776360 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-conf-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-ovn\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdblz\" (UniqueName: \"kubernetes.io/projected/21b5c294-9caa-41ea-8bd0-357a8981ec9b-kube-api-access-cdblz\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysconfig\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-kubernetes\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-cnibin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysconfig\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-cnibin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-kubernetes\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776605 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-netns\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqvb\" (UniqueName: \"kubernetes.io/projected/c6a74de4-3b80-4092-8de3-2a795216aa48-kube-api-access-lbqvb\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.776672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776654 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-var-lib-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-netns\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-host\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-var-lib-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236582ca-8db3-417e-b2cf-dc0053b8afcf-tmp-dir\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-host\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpn5\" (UniqueName: \"kubernetes.io/projected/18767c55-78c8-48a0-ae7f-c2c09ebac544-kube-api-access-gwpn5\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2sqc\" (UniqueName: \"kubernetes.io/projected/236582ca-8db3-417e-b2cf-dc0053b8afcf-kube-api-access-r2sqc\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776943 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-config\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.776996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-socket-dir-parent\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.777533 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 
07:52:21.777053 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-multus-socket-dir-parent\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-bin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777140 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8jk\" (UniqueName: \"kubernetes.io/projected/6eb6689c-0d36-4771-bcac-8118455cada4-kube-api-access-fb8jk\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-env-overrides\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777178 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-bin\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovn-node-metrics-cert\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-sys-fs\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-multus\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777279 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-multus-daemon-config\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 
07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18767c55-78c8-48a0-ae7f-c2c09ebac544-host-slash\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777348 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236582ca-8db3-417e-b2cf-dc0053b8afcf-hosts-file\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-systemd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e79f54-de56-4c95-814b-f41296cabe3d-konnectivity-ca\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-systemd\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-os-release\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-k8s-cni-cncf-io\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.778299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777529 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-cni-multus\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-os-release\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " 
pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-script-lib\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-config\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-kubelet\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6a74de4-3b80-4092-8de3-2a795216aa48-serviceca\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-etc-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-env-overrides\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777701 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-etc-selinux\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777727 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-system-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-var-lib-kubelet\") pod \"multus-5r6l7\" (UID: 
\"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18767c55-78c8-48a0-ae7f-c2c09ebac544-iptables-alerter-script\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-systemd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-slash\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777818 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-netns\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-modprobe-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-etc-tuned\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.779085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpnrx\" (UniqueName: \"kubernetes.io/projected/02603763-8df0-4492-8876-283310358655-kube-api-access-lpnrx\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6a74de4-3b80-4092-8de3-2a795216aa48-host\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzfs\" (UniqueName: 
\"kubernetes.io/projected/b042621c-fbf8-4739-b758-0e481535b940-kube-api-access-rtzfs\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-bin\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778065 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34e79f54-de56-4c95-814b-f41296cabe3d-konnectivity-ca\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-cni-binary-copy\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-multus-daemon-config\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-multus-certs\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778153 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-systemd\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778176 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/18767c55-78c8-48a0-ae7f-c2c09ebac544-host-slash\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778207 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-k8s-cni-cncf-io\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-systemd-units\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.777497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-sys-fs\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-netd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-os-release\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-device-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.779853 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovnkube-script-lib\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-sys\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-socket-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.780540 ip-10-0-138-233 
kubenswrapper[2570]: I0417 07:52:21.778386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-system-cni-dir\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-slash\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-etc-kubernetes\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778429 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-etc-kubernetes\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236582ca-8db3-417e-b2cf-dc0053b8afcf-hosts-file\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-node-log\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-system-cni-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " 
pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778552 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e79f54-de56-4c95-814b-f41296cabe3d-agent-certs\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778657 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-lib-modules\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vlb\" (UniqueName: \"kubernetes.io/projected/cadd455a-a423-45c3-9621-616ec07f359b-kube-api-access-k4vlb\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.780540 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-cnibin\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-bin\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-systemd-units\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.778938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/c6a74de4-3b80-4092-8de3-2a795216aa48-serviceca\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779050 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-sys\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-run-netns\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-host-cni-netd\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-device-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-modprobe-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-etc-selinux\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779217 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-run-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779245 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-etc-openvswitch\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/6eb6689c-0d36-4771-bcac-8118455cada4-cni-binary-copy\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cadd455a-a423-45c3-9621-616ec07f359b-socket-dir\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21b5c294-9caa-41ea-8bd0-357a8981ec9b-node-log\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6eb6689c-0d36-4771-bcac-8118455cada4-host-run-multus-certs\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779438 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-lib-modules\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779474 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6a74de4-3b80-4092-8de3-2a795216aa48-host\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.781033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/02603763-8df0-4492-8876-283310358655-etc-sysctl-d\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781803 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.779944 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/18767c55-78c8-48a0-ae7f-c2c09ebac544-iptables-alerter-script\") pod \"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.781803 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.780168 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-tmp\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781803 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.780354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/21b5c294-9caa-41ea-8bd0-357a8981ec9b-ovn-node-metrics-cert\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.781803 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.780569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/02603763-8df0-4492-8876-283310358655-etc-tuned\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.781803 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.781586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34e79f54-de56-4c95-814b-f41296cabe3d-agent-certs\") pod \"konnectivity-agent-4z55l\" (UID: \"34e79f54-de56-4c95-814b-f41296cabe3d\") " pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:21.785368 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.785351 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:21.785449 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.785374 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:21.785449 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.785386 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:21.785535 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:21.785450 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.285431018 +0000 UTC m=+2.094441750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:21.788268 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.788154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g669x\" (UniqueName: \"kubernetes.io/projected/b124ed47-021a-4bde-8c03-dcfce0f301d8-kube-api-access-g669x\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:21.788388 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.788365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vlb\" (UniqueName: \"kubernetes.io/projected/cadd455a-a423-45c3-9621-616ec07f359b-kube-api-access-k4vlb\") pod \"aws-ebs-csi-driver-node-4pgsx\" (UID: \"cadd455a-a423-45c3-9621-616ec07f359b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.789052 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.789028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2sqc\" (UniqueName: \"kubernetes.io/projected/236582ca-8db3-417e-b2cf-dc0053b8afcf-kube-api-access-r2sqc\") pod \"node-resolver-5v2tv\" (UID: \"236582ca-8db3-417e-b2cf-dc0053b8afcf\") " pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.790338 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.790274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqvb\" (UniqueName: \"kubernetes.io/projected/c6a74de4-3b80-4092-8de3-2a795216aa48-kube-api-access-lbqvb\") pod \"node-ca-4tp7h\" (UID: \"c6a74de4-3b80-4092-8de3-2a795216aa48\") " pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.790646 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.790625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdblz\" (UniqueName: \"kubernetes.io/projected/21b5c294-9caa-41ea-8bd0-357a8981ec9b-kube-api-access-cdblz\") pod \"ovnkube-node-x6ppg\" (UID: \"21b5c294-9caa-41ea-8bd0-357a8981ec9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:21.791081 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.791059 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpnrx\" (UniqueName: \"kubernetes.io/projected/02603763-8df0-4492-8876-283310358655-kube-api-access-lpnrx\") pod \"tuned-wh74p\" (UID: \"02603763-8df0-4492-8876-283310358655\") " pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:21.791560 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.791543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb8jk\" (UniqueName: \"kubernetes.io/projected/6eb6689c-0d36-4771-bcac-8118455cada4-kube-api-access-fb8jk\") pod \"multus-5r6l7\" (UID: \"6eb6689c-0d36-4771-bcac-8118455cada4\") " pod="openshift-multus/multus-5r6l7" Apr 17 07:52:21.792989 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.792969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpn5\" (UniqueName: \"kubernetes.io/projected/18767c55-78c8-48a0-ae7f-c2c09ebac544-kube-api-access-gwpn5\") pod 
\"iptables-alerter-xl7sg\" (UID: \"18767c55-78c8-48a0-ae7f-c2c09ebac544\") " pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:21.804866 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.804845 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5v2tv" Apr 17 07:52:21.827927 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.827886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4tp7h" Apr 17 07:52:21.846534 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:21.846503 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169311e9e20ae28d37e18c3438cd6482.slice/crio-14e7fe072212bc411e82e01762a8196b6d5e77c7257dec5d5caca09f51d2d1f1 WatchSource:0}: Error finding container 14e7fe072212bc411e82e01762a8196b6d5e77c7257dec5d5caca09f51d2d1f1: Status 404 returned error can't find the container with id 14e7fe072212bc411e82e01762a8196b6d5e77c7257dec5d5caca09f51d2d1f1 Apr 17 07:52:21.846944 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:21.846929 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1986491dc9fe18f2259782ab504301ca.slice/crio-32770b4aa7a01bd3ad530fae48d984c9f382d1cab5c6a0ef148259e675085c85 WatchSource:0}: Error finding container 32770b4aa7a01bd3ad530fae48d984c9f382d1cab5c6a0ef148259e675085c85: Status 404 returned error can't find the container with id 32770b4aa7a01bd3ad530fae48d984c9f382d1cab5c6a0ef148259e675085c85 Apr 17 07:52:21.847809 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:21.847787 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a74de4_3b80_4092_8de3_2a795216aa48.slice/crio-6be977cb5c949ab72578eeec31f75f66c3982db8a54d45635979e62787a2f329 WatchSource:0}: Error finding container 6be977cb5c949ab72578eeec31f75f66c3982db8a54d45635979e62787a2f329: Status 404 returned error can't find the container with id 6be977cb5c949ab72578eeec31f75f66c3982db8a54d45635979e62787a2f329 Apr 17 07:52:21.852178 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.852165 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:52:21.879101 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.879077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.879187 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.879109 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-os-release\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.879779 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.879748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzfs\" (UniqueName: \"kubernetes.io/projected/b042621c-fbf8-4739-b758-0e481535b940-kube-api-access-rtzfs\") 
pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.879779 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.879764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-os-release\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.879996 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.879972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880186 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880250 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880297 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880355 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-system-cni-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880355 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-cnibin\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880473 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-cnibin\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " 
pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880632 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.880991 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.880970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.881061 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.881044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b042621c-fbf8-4739-b758-0e481535b940-system-cni-dir\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.881104 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.881057 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b042621c-fbf8-4739-b758-0e481535b940-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.888021 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.888000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzfs\" (UniqueName: \"kubernetes.io/projected/b042621c-fbf8-4739-b758-0e481535b940-kube-api-access-rtzfs\") pod \"multus-additional-cni-plugins-jgncd\" (UID: \"b042621c-fbf8-4739-b758-0e481535b940\") " pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:21.989115 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:21.989087 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" Apr 17 07:52:21.995222 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:21.995202 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcadd455a_a423_45c3_9621_616ec07f359b.slice/crio-ee319958abc5ac3cf13153f6a91db8c01f48ed04dbf7d06e317f36251d795c04 WatchSource:0}: Error finding container ee319958abc5ac3cf13153f6a91db8c01f48ed04dbf7d06e317f36251d795c04: Status 404 returned error can't find the container with id ee319958abc5ac3cf13153f6a91db8c01f48ed04dbf7d06e317f36251d795c04 Apr 17 07:52:22.003125 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.003110 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5r6l7" Apr 17 07:52:22.009128 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.009107 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb6689c_0d36_4771_bcac_8118455cada4.slice/crio-1f85d3e7b67b513314e0bb6bed30c4794e693930eb5b36eef8ef42feba66037e WatchSource:0}: Error finding container 1f85d3e7b67b513314e0bb6bed30c4794e693930eb5b36eef8ef42feba66037e: Status 404 returned error can't find the container with id 1f85d3e7b67b513314e0bb6bed30c4794e693930eb5b36eef8ef42feba66037e Apr 17 07:52:22.020136 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.020113 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xl7sg" Apr 17 07:52:22.026961 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.026940 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18767c55_78c8_48a0_ae7f_c2c09ebac544.slice/crio-d1e9654fc4fe27cafde50ba501cbd7002be2d3c7f64738a33496e79ec94bbcc9 WatchSource:0}: Error finding container d1e9654fc4fe27cafde50ba501cbd7002be2d3c7f64738a33496e79ec94bbcc9: Status 404 returned error can't find the container with id d1e9654fc4fe27cafde50ba501cbd7002be2d3c7f64738a33496e79ec94bbcc9 Apr 17 07:52:22.040794 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.040778 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" Apr 17 07:52:22.046535 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.046515 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b5c294_9caa_41ea_8bd0_357a8981ec9b.slice/crio-9991919b250ce4905e9322494b801470a8cf3bc90936b92be6838021747274c6 WatchSource:0}: Error finding container 9991919b250ce4905e9322494b801470a8cf3bc90936b92be6838021747274c6: Status 404 returned error can't find the container with id 9991919b250ce4905e9322494b801470a8cf3bc90936b92be6838021747274c6 Apr 17 07:52:22.061862 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.061839 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4z55l" Apr 17 07:52:22.067785 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.067759 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e79f54_de56_4c95_814b_f41296cabe3d.slice/crio-5c4e62f24731c264fb8fb2aba486e77e29721b5dfc912daae0cac156ec29f534 WatchSource:0}: Error finding container 5c4e62f24731c264fb8fb2aba486e77e29721b5dfc912daae0cac156ec29f534: Status 404 returned error can't find the container with id 5c4e62f24731c264fb8fb2aba486e77e29721b5dfc912daae0cac156ec29f534 Apr 17 07:52:22.078181 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.078141 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wh74p" Apr 17 07:52:22.089661 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.089643 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02603763_8df0_4492_8876_283310358655.slice/crio-4f48a5dbe761b2ebe3060a880562709db8e4ea0292953d3ee9dc230a2c199d27 WatchSource:0}: Error finding container 4f48a5dbe761b2ebe3060a880562709db8e4ea0292953d3ee9dc230a2c199d27: Status 404 returned error can't find the container with id 4f48a5dbe761b2ebe3060a880562709db8e4ea0292953d3ee9dc230a2c199d27 Apr 17 07:52:22.098541 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.098524 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod236582ca_8db3_417e_b2cf_dc0053b8afcf.slice/crio-8dac8973d33397862c630adfc725b4c9765920e2bec70fe77ec115aebfce1a8b WatchSource:0}: Error finding container 8dac8973d33397862c630adfc725b4c9765920e2bec70fe77ec115aebfce1a8b: Status 404 returned error can't find the container with id 8dac8973d33397862c630adfc725b4c9765920e2bec70fe77ec115aebfce1a8b Apr 17 07:52:22.133701 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.133683 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jgncd" Apr 17 07:52:22.139057 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:22.139035 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb042621c_fbf8_4739_b758_0e481535b940.slice/crio-51f7b12cd9d1c72a6e186aebed775c0555f78dfdc0b41e2e53f1e9ca593b8229 WatchSource:0}: Error finding container 51f7b12cd9d1c72a6e186aebed775c0555f78dfdc0b41e2e53f1e9ca593b8229: Status 404 returned error can't find the container with id 51f7b12cd9d1c72a6e186aebed775c0555f78dfdc0b41e2e53f1e9ca593b8229 Apr 17 07:52:22.282972 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.282942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:22.283114 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.283050 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:22.283114 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.283103 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:23.28308911 +0000 UTC m=+3.092099818 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:22.295908 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.295880 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x9bnq"] Apr 17 07:52:22.299868 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.299852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.299951 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.299909 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879" Apr 17 07:52:22.383671 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.383625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-kubelet-config\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.383845 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.383693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.383845 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.383757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-dbus\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.383845 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.383814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:22.384002 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.383942 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:22.384002 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.383959 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:22.384002 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.383972 2570 projected.go:194] Error preparing data for projected 
volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:22.384152 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.384026 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:23.384008589 +0000 UTC m=+3.193019312 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:22.410238 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.410210 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:22.487719 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.487685 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-kubelet-config\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.487891 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.487743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.487891 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.487789 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-dbus\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.488002 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.487978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-dbus\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.488052 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.488035 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/746ed911-6342-422a-910a-f742c65c2879-kubelet-config\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.488139 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.488124 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:22.488207 ip-10-0-138-233 
kubenswrapper[2570]: E0417 07:52:22.488182 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:22.988164396 +0000 UTC m=+2.797175109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:22.584228 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.583906 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:22.708270 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.708108 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:21 +0000 UTC" deadline="2027-11-28 15:52:33.389999235 +0000 UTC" Apr 17 07:52:22.708270 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.708140 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14168h0m10.681862706s" Apr 17 07:52:22.726716 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.726388 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 07:52:22.824614 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.824587 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:22.824790 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.824625 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:22.824790 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.824713 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:22.824899 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.824848 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb" Apr 17 07:52:22.838040 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.837989 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5v2tv" event={"ID":"236582ca-8db3-417e-b2cf-dc0053b8afcf","Type":"ContainerStarted","Data":"8dac8973d33397862c630adfc725b4c9765920e2bec70fe77ec115aebfce1a8b"} Apr 17 07:52:22.850593 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.850561 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wh74p" event={"ID":"02603763-8df0-4492-8876-283310358655","Type":"ContainerStarted","Data":"4f48a5dbe761b2ebe3060a880562709db8e4ea0292953d3ee9dc230a2c199d27"} Apr 17 07:52:22.869123 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.869080 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4z55l" event={"ID":"34e79f54-de56-4c95-814b-f41296cabe3d","Type":"ContainerStarted","Data":"5c4e62f24731c264fb8fb2aba486e77e29721b5dfc912daae0cac156ec29f534"} Apr 17 07:52:22.871958 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.871930 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal" event={"ID":"1986491dc9fe18f2259782ab504301ca","Type":"ContainerStarted","Data":"32770b4aa7a01bd3ad530fae48d984c9f382d1cab5c6a0ef148259e675085c85"} Apr 17 07:52:22.884461 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.884433 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal" event={"ID":"169311e9e20ae28d37e18c3438cd6482","Type":"ContainerStarted","Data":"14e7fe072212bc411e82e01762a8196b6d5e77c7257dec5d5caca09f51d2d1f1"} Apr 17 07:52:22.891525 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.891499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerStarted","Data":"51f7b12cd9d1c72a6e186aebed775c0555f78dfdc0b41e2e53f1e9ca593b8229"} Apr 17 07:52:22.927719 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.927685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"9991919b250ce4905e9322494b801470a8cf3bc90936b92be6838021747274c6"} Apr 17 07:52:22.939611 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.939580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xl7sg" event={"ID":"18767c55-78c8-48a0-ae7f-c2c09ebac544","Type":"ContainerStarted","Data":"d1e9654fc4fe27cafde50ba501cbd7002be2d3c7f64738a33496e79ec94bbcc9"} Apr 17 07:52:22.946684 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.946654 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5r6l7" event={"ID":"6eb6689c-0d36-4771-bcac-8118455cada4","Type":"ContainerStarted","Data":"1f85d3e7b67b513314e0bb6bed30c4794e693930eb5b36eef8ef42feba66037e"} Apr 17 07:52:22.954040 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.954014 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" 
event={"ID":"cadd455a-a423-45c3-9621-616ec07f359b","Type":"ContainerStarted","Data":"ee319958abc5ac3cf13153f6a91db8c01f48ed04dbf7d06e317f36251d795c04"} Apr 17 07:52:22.970488 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.970429 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4tp7h" event={"ID":"c6a74de4-3b80-4092-8de3-2a795216aa48","Type":"ContainerStarted","Data":"6be977cb5c949ab72578eeec31f75f66c3982db8a54d45635979e62787a2f329"} Apr 17 07:52:22.991959 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:22.991922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:22.992174 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.992149 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:22.992247 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:22.992217 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:23.9921982 +0000 UTC m=+3.801208922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:23.295531 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:23.295455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:23.295710 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.295632 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:23.295710 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.295700 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.295682503 +0000 UTC m=+5.104693216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:23.396716 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:23.396664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:23.396911 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.396895 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:23.396992 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.396917 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:23.396992 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.396930 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:23.397113 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.397002 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:25.396982864 +0000 UTC m=+5.205993577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:23.709000 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:23.708727 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:47:21 +0000 UTC" deadline="2027-10-23 02:28:49.812209226 +0000 UTC" Apr 17 07:52:23.709000 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:23.708968 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13290h36m26.103245204s" Apr 17 07:52:23.824428 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:23.824399 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:23.824601 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:23.824531 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879" Apr 17 07:52:24.002992 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:24.001373 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:24.002992 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:24.001564 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:24.002992 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:24.001631 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:26.001607815 +0000 UTC m=+5.810618546 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:24.825521 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:24.825486 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:24.825945 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:24.825484 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:24.825945 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:24.825634 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:24.825945 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:24.825747 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb" Apr 17 07:52:25.321298 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:25.321261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:25.321494 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.321436 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:25.321569 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.321513 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.321493803 +0000 UTC m=+9.130504511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:25.423387 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:25.422783 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:25.423387 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.422920 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:25.423387 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.422942 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:25.423387 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.422955 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:25.423387 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.423022 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:29.42299602 +0000 UTC m=+9.232006748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:25.824835 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:25.824806 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:25.824998 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:25.824913 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879" Apr 17 07:52:26.028737 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:26.028695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:26.029209 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:26.028848 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:26.029209 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:26.028910 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:30.028892022 +0000 UTC m=+9.837902735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:26.824611 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:26.824577 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:26.824785 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:26.824696 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb" Apr 17 07:52:26.824876 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:26.824853 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:26.825019 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:26.824978 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:27.825286 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:27.825254 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:27.825756 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:27.825409 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879" Apr 17 07:52:28.825660 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:28.825595 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:28.825660 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:28.825622 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:28.826154 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:28.825735 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb" Apr 17 07:52:28.826154 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:28.825932 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:29.357201 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:29.357162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:29.357409 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.357332 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:29.357501 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.357421 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:37.357398849 +0000 UTC m=+17.166409575 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 07:52:29.458045 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:29.457832 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x" Apr 17 07:52:29.458045 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.458035 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 07:52:29.458045 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.458054 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 07:52:29.458345 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.458066 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:29.458345 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.458124 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:37.458106613 +0000 UTC m=+17.267117321 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 07:52:29.825095 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:29.825011 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:29.825287 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:29.825144 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879" Apr 17 07:52:30.063989 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:30.063947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq" Apr 17 07:52:30.064451 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:30.064150 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:30.064451 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:30.064215 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:38.064197634 +0000 UTC m=+17.873208365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered Apr 17 07:52:30.825741 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:30.825657 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 07:52:30.825906 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:30.825768 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8" Apr 17 07:52:30.825906 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:30.825822 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:30.825906 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:30.825882 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:31.825006 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:31.824975 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:31.825460 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:31.825109 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:32.825230 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:32.825188 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:32.825664 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:32.825245 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:32.825664 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:32.825350 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:32.825664 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:32.825455 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:33.824942 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:33.824901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:33.825117 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:33.825022 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:34.825262 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:34.825203 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:34.825262 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:34.825230 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:34.825812 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:34.825359 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:34.825812 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:34.825483 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:35.825370 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:35.825343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:35.825667 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:35.825463 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:36.825326 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:36.825273 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:36.825326 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:36.825273 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:36.825848 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:36.825432 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:36.825848 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:36.825573 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:37.420612 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:37.420582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:37.420857 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.420685 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:37.420857 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.420751 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.420731944 +0000 UTC m=+33.229742674 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:37.520892 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:37.520856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:37.521075 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.521015 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:37.521075 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.521036 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:37.521075 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.521045 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:37.521196 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.521099 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:52:53.521080617 +0000 UTC m=+33.330091328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:37.825098 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:37.825026 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:37.825235 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:37.825132 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:38.124539 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:38.124511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:38.124967 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:38.124638 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:38.124967 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:38.124704 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret podName:746ed911-6342-422a-910a-f742c65c2879 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:54.124685981 +0000 UTC m=+33.933696689 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret") pod "global-pull-secret-syncer-x9bnq" (UID: "746ed911-6342-422a-910a-f742c65c2879") : object "kube-system"/"original-pull-secret" not registered
Apr 17 07:52:38.825214 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:38.825177 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:38.825397 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:38.825189 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:38.825397 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:38.825306 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:38.825578 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:38.825397 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:39.824628 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:39.824593 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:39.825019 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:39.824708 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:40.826184 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:40.825969 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:40.826702 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:40.826049 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:40.826702 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:40.826372 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:40.826702 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:40.826485 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:41.004948 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.004918 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wh74p" event={"ID":"02603763-8df0-4492-8876-283310358655","Type":"ContainerStarted","Data":"275debed85cd8b67c0d55c9257c9d884d1d51577fdf77a7f1795991ca7cdf7d5"}
Apr 17 07:52:41.007951 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.007672 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal" event={"ID":"169311e9e20ae28d37e18c3438cd6482","Type":"ContainerStarted","Data":"aa26e24027be2c0c4e9a063cb5d0822130b79b20deb220b1581dc1c1c117486f"}
Apr 17 07:52:41.010923 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.010896 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"3ff0f6cc71873daca1ee056e2e5518e12aae1da3c1cb10d0649e8b64649bac0d"}
Apr 17 07:52:41.011015 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.010932 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"c5b20e9de9dc62bd02cfa3a77e6283ff2902dca5d2911cd45a71a78b1e00a238"}
Apr 17 07:52:41.011015 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.010942 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"c7452e17abcb71241350d326017eb44568f1d5220ff15ab2e777ad796151be0a"}
Apr 17 07:52:41.011015 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.010952 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"082e6c2f1188955fbe570fca848602e05081fbcebe75ffa8a72f8be9260f1ace"}
Apr 17 07:52:41.012444 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.012407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5r6l7" event={"ID":"6eb6689c-0d36-4771-bcac-8118455cada4","Type":"ContainerStarted","Data":"1db92f0e0bfde08e339c9f79142d9e7f5311020f20fb604122a56b9128d6372a"}
Apr 17 07:52:41.022244 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.022204 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wh74p" podStartSLOduration=2.989788033 podStartE2EDuration="21.02219187s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.093360112 +0000 UTC m=+1.902370821" lastFinishedPulling="2026-04-17 07:52:40.125763936 +0000 UTC m=+19.934774658" observedRunningTime="2026-04-17 07:52:41.022023238 +0000 UTC m=+20.831033992" watchObservedRunningTime="2026-04-17 07:52:41.02219187 +0000 UTC m=+20.831202600"
Apr 17 07:52:41.035589 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.035494 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-233.ec2.internal" podStartSLOduration=21.035479128 podStartE2EDuration="21.035479128s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:41.035455036 +0000 UTC m=+20.844465810" watchObservedRunningTime="2026-04-17 07:52:41.035479128 +0000 UTC m=+20.844489859"
Apr 17 07:52:41.051213 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.051171 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5r6l7" podStartSLOduration=2.906608782 podStartE2EDuration="21.05115771s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.010996594 +0000 UTC m=+1.820007303" lastFinishedPulling="2026-04-17 07:52:40.15554551 +0000 UTC m=+19.964556231" observedRunningTime="2026-04-17 07:52:41.050586664 +0000 UTC m=+20.859597395" watchObservedRunningTime="2026-04-17 07:52:41.05115771 +0000 UTC m=+20.860168440"
Apr 17 07:52:41.825008 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:41.824830 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:41.825158 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:41.825111 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:42.014922 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.014846 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5v2tv" event={"ID":"236582ca-8db3-417e-b2cf-dc0053b8afcf","Type":"ContainerStarted","Data":"3a0d1f4ea29cf06926628de69303c578e5fa2ba33eb06456eba439e0ac2dc9cb"}
Apr 17 07:52:42.016090 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.016060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4z55l" event={"ID":"34e79f54-de56-4c95-814b-f41296cabe3d","Type":"ContainerStarted","Data":"3e59294ee7fe89ca8139e230acab1e1bc18f67b8b3b19788e984877ba0ba59be"}
Apr 17 07:52:42.017326 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.017284 2570 generic.go:358] "Generic (PLEG): container finished" podID="1986491dc9fe18f2259782ab504301ca" containerID="bc07c6f13f1dc3dd9dd11e35079d4d1aa54847524e63fb1ecdc4d5bf558b0222" exitCode=0
Apr 17 07:52:42.017407 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.017378 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal" event={"ID":"1986491dc9fe18f2259782ab504301ca","Type":"ContainerDied","Data":"bc07c6f13f1dc3dd9dd11e35079d4d1aa54847524e63fb1ecdc4d5bf558b0222"}
Apr 17 07:52:42.018690 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.018662 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="e1bf8e4b25c8b2ded48eff3c83f104856d40af37290275840d6a94e179de1aa2" exitCode=0
Apr 17 07:52:42.018780 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.018724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"e1bf8e4b25c8b2ded48eff3c83f104856d40af37290275840d6a94e179de1aa2"}
Apr 17 07:52:42.021089 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.021068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"e823f304aa95f8a850d67d9676d76dcb50a9887f8536453c6527534ce0571b1d"}
Apr 17 07:52:42.021166 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.021096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"512432fa89c0396fd9ea12e46b9234554e6e04cfacd47ec97714da8546f6de3b"}
Apr 17 07:52:42.022173 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.022152 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xl7sg" event={"ID":"18767c55-78c8-48a0-ae7f-c2c09ebac544","Type":"ContainerStarted","Data":"cb900ab56655b1200a32260248dd1447ac60d5b28d8393cf56100c2f09942e62"}
Apr 17 07:52:42.023358 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.023337 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" event={"ID":"cadd455a-a423-45c3-9621-616ec07f359b","Type":"ContainerStarted","Data":"0ce1ebd31657292982d4d7ffc0a1379eb6e85ceb6b92d15473706e5afbb49abc"}
Apr 17 07:52:42.024363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.024344 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4tp7h" event={"ID":"c6a74de4-3b80-4092-8de3-2a795216aa48","Type":"ContainerStarted","Data":"05843577a29a161c2fc97f3cd99a8433796c508a81123c71ae9c3b4bad9b22b5"}
Apr 17 07:52:42.028137 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.028104 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5v2tv" podStartSLOduration=4.005189455 podStartE2EDuration="22.028094813s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.099821581 +0000 UTC m=+1.908832292" lastFinishedPulling="2026-04-17 07:52:40.12272693 +0000 UTC m=+19.931737650" observedRunningTime="2026-04-17 07:52:42.027805066 +0000 UTC m=+21.836815796" watchObservedRunningTime="2026-04-17 07:52:42.028094813 +0000 UTC m=+21.837105542"
Apr 17 07:52:42.073413 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.073375 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4z55l" podStartSLOduration=4.019667924 podStartE2EDuration="22.073360054s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.069065506 +0000 UTC m=+1.878076213" lastFinishedPulling="2026-04-17 07:52:40.12275762 +0000 UTC m=+19.931768343" observedRunningTime="2026-04-17 07:52:42.073151439 +0000 UTC m=+21.882162168" watchObservedRunningTime="2026-04-17 07:52:42.073360054 +0000 UTC m=+21.882370785"
Apr 17 07:52:42.088972 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.088925 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xl7sg" podStartSLOduration=3.994552373 podStartE2EDuration="22.088910556s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.028382733 +0000 UTC m=+1.837393444" lastFinishedPulling="2026-04-17 07:52:40.122740903 +0000 UTC m=+19.931751627" observedRunningTime="2026-04-17 07:52:42.088684973 +0000 UTC m=+21.897695706" watchObservedRunningTime="2026-04-17 07:52:42.088910556 +0000 UTC m=+21.897921287"
Apr 17 07:52:42.101171 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.100697 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4tp7h" podStartSLOduration=3.828720588 podStartE2EDuration="22.10068118s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:21.852399661 +0000 UTC m=+1.661410368" lastFinishedPulling="2026-04-17 07:52:40.124360248 +0000 UTC m=+19.933370960" observedRunningTime="2026-04-17 07:52:42.100608023 +0000 UTC m=+21.909618754" watchObservedRunningTime="2026-04-17 07:52:42.10068118 +0000 UTC m=+21.909691910"
Apr 17 07:52:42.182618 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.182581 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 07:52:42.748421 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.748281 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T07:52:42.182598915Z","UUID":"e15282ca-7e33-47b1-98e0-d84eff7e0748","Handler":null,"Name":"","Endpoint":""}
Apr 17 07:52:42.750352 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.750293 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 07:52:42.750352 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.750344 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 07:52:42.824902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.824877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:42.825053 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:42.825021 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:42.825053 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:42.825021 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:42.825222 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:42.825110 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:43.028555 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:43.028471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" event={"ID":"cadd455a-a423-45c3-9621-616ec07f359b","Type":"ContainerStarted","Data":"d0a2710ea3700fa89483ffae7b530ff31f8d819ccf922361b5d0b60a45f7bd61"}
Apr 17 07:52:43.030284 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:43.030244 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal" event={"ID":"1986491dc9fe18f2259782ab504301ca","Type":"ContainerStarted","Data":"28de496c46a34d31fed8116943865c3f04790e7210009b12e4b22329626e9e9a"}
Apr 17 07:52:43.045572 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:43.045532 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-233.ec2.internal" podStartSLOduration=23.045518664 podStartE2EDuration="23.045518664s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:52:43.044906552 +0000 UTC m=+22.853917306" watchObservedRunningTime="2026-04-17 07:52:43.045518664 +0000 UTC m=+22.854529395"
Apr 17 07:52:43.825581 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:43.825379 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:43.825768 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:43.825660 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:44.034402 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:44.034363 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" event={"ID":"cadd455a-a423-45c3-9621-616ec07f359b","Type":"ContainerStarted","Data":"ac7b11cc7e503344fc00c421b2194b52b23f8753e557e4cbd80618a595fd904b"}
Apr 17 07:52:44.037677 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:44.037643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"1759f309380f2b010b503ad6f334d89b5a31989a70b36ddf212b42d2dfa5534a"}
Apr 17 07:52:44.051340 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:44.051281 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4pgsx" podStartSLOduration=2.972968963 podStartE2EDuration="24.051267947s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:21.996780046 +0000 UTC m=+1.805790754" lastFinishedPulling="2026-04-17 07:52:43.07507902 +0000 UTC m=+22.884089738" observedRunningTime="2026-04-17 07:52:44.050861467 +0000 UTC m=+23.859872201" watchObservedRunningTime="2026-04-17 07:52:44.051267947 +0000 UTC m=+23.860278678"
Apr 17 07:52:44.825182 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:44.825150 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:44.825376 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:44.825269 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:44.825376 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:44.825354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:44.825500 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:44.825476 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:45.797588 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:45.797550 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4z55l"
Apr 17 07:52:45.798254 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:45.798138 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4z55l"
Apr 17 07:52:45.824588 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:45.824553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:45.824747 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:45.824644 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:46.045521 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.045408 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" event={"ID":"21b5c294-9caa-41ea-8bd0-357a8981ec9b","Type":"ContainerStarted","Data":"f77197f3f6ef9f5508c1070771b312884af052f775f6cc45c4e06aabc1f792b1"}
Apr 17 07:52:46.045897 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.045871 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4z55l"
Apr 17 07:52:46.045984 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.045917 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:52:46.045984 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.045929 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:52:46.046725 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.046608 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4z55l"
Apr 17 07:52:46.062391 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.062249 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:52:46.072880 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.072827 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg" podStartSLOduration=7.52126815 podStartE2EDuration="26.072812904s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.047991862 +0000 UTC m=+1.857002572" lastFinishedPulling="2026-04-17 07:52:40.599536615 +0000 UTC m=+20.408547326" observedRunningTime="2026-04-17 07:52:46.072291469 +0000 UTC m=+25.881302239" watchObservedRunningTime="2026-04-17 07:52:46.072812904 +0000 UTC m=+25.881823634"
Apr 17 07:52:46.824676 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.824640 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:46.825158 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:46.824685 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:46.825158 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:46.824786 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:46.825158 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:46.824881 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:47.048918 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:47.048883 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="87ad2f061ecf3292f5fe4ae27add45e86a54be22c313dd717c5a60b557cef7a4" exitCode=0
Apr 17 07:52:47.049078 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:47.048957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"87ad2f061ecf3292f5fe4ae27add45e86a54be22c313dd717c5a60b557cef7a4"}
Apr 17 07:52:47.050067 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:47.049751 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:52:47.064786 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:47.064768 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:52:47.825053 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:47.825031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:47.825375 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:47.825135 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:48.052914 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.052882 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="c3b4538970a320b866b57533ba976f71d26c0ebe9d522a7cef5645de6f85d3e6" exitCode=0
Apr 17 07:52:48.053075 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.052975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"c3b4538970a320b866b57533ba976f71d26c0ebe9d522a7cef5645de6f85d3e6"}
Apr 17 07:52:48.174299 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.174271 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x9bnq"]
Apr 17 07:52:48.174466 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.174374 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:48.174466 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:48.174452 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:48.177153 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.177130 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dh54x"]
Apr 17 07:52:48.177266 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.177225 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:48.177304 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:48.177291 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:48.185833 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.185809 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7bs5q"]
Apr 17 07:52:48.185958 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:48.185913 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:48.186019 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:48.185993 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:49.056933 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:49.056897 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="1be741094a22dee872116abc8d45ce1c9befb924c8a7e1ed62ac5328c708fc8f" exitCode=0
Apr 17 07:52:49.057385 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:49.056982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"1be741094a22dee872116abc8d45ce1c9befb924c8a7e1ed62ac5328c708fc8f"}
Apr 17 07:52:49.824604 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:49.824572 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:49.824786 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:49.824693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:49.824786 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:49.824704 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:49.824908 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:49.824807 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:49.824908 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:49.824865 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:49.825018 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:49.824961 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:51.825351 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:51.825240 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:51.825351 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:51.825293 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:51.825351 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:51.825291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:51.826045 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:51.825406 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dh54x" podUID="7fe9b09c-633f-48ec-916f-04b365b73fcb"
Apr 17 07:52:51.826045 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:51.825552 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x9bnq" podUID="746ed911-6342-422a-910a-f742c65c2879"
Apr 17 07:52:51.826045 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:51.825658 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:52:53.442205 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.442123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:53.442715 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.442291 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:53.442715 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.442383 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:25.442362641 +0000 UTC m=+65.251373356 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 07:52:53.542557 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.542521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:53.542758 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.542713 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 07:52:53.542758 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.542741 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 07:52:53.542758 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.542753 2570 projected.go:194] Error preparing data for projected volume kube-api-access-hcfw4 for pod openshift-network-diagnostics/network-check-target-dh54x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:53.542905 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.542816 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4 podName:7fe9b09c-633f-48ec-916f-04b365b73fcb nodeName:}" failed. No retries permitted until 2026-04-17 07:53:25.542802248 +0000 UTC m=+65.351812962 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hcfw4" (UniqueName: "kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4") pod "network-check-target-dh54x" (UID: "7fe9b09c-633f-48ec-916f-04b365b73fcb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 07:52:53.556562 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.556534 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-233.ec2.internal" event="NodeReady"
Apr 17 07:52:53.556711 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.556682 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 07:52:53.597728 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.597702 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vrvr9"]
Apr 17 07:52:53.625862 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.625819 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qnv6r"]
Apr 17 07:52:53.626159 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.626137 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.628518 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.628480 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 07:52:53.628623 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.628523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\""
Apr 17 07:52:53.628687 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.628625 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 07:52:53.643574 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.643548 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vrvr9"]
Apr 17 07:52:53.643574 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.643577 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnv6r"]
Apr 17 07:52:53.643750 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.643677 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:53.645704 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.645685 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 07:52:53.645805 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.645708 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 07:52:53.645962 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.645943 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 07:52:53.646064 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.646027 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\""
Apr 17 07:52:53.744563 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-config-volume\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.744563 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744518 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:53.744563 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-tmp-dir\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.744821 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744630 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjnqz\" (UniqueName: \"kubernetes.io/projected/77b6c827-22ca-45dc-8ce4-0267c31539a0-kube-api-access-cjnqz\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:53.744821 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.744821 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.744755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph58v\" (UniqueName: \"kubernetes.io/projected/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-kube-api-access-ph58v\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.824869 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.824834 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:52:53.825037 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.824834 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:53.825037 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.824834 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:52:53.827625 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827507 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:52:53.827625 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827536 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:52:53.827625 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827541 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\""
Apr 17 07:52:53.827625 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:52:53.827901 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827657 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 07:52:53.827901 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.827860 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\""
Apr 17 07:52:53.845215 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjnqz\" (UniqueName: \"kubernetes.io/projected/77b6c827-22ca-45dc-8ce4-0267c31539a0-kube-api-access-cjnqz\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:53.845369 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.845369 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph58v\" (UniqueName: \"kubernetes.io/projected/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-kube-api-access-ph58v\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.845369 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-config-volume\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.845369 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:53.845635 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-tmp-dir\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.845635 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.845413 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:53.845635 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.845510 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:54.345489325 +0000 UTC m=+34.154500048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:53.845635 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.845532 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:53.845635 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:53.845577 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:54.3455661 +0000 UTC m=+34.154576807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:52:53.845893 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-tmp-dir\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.845996 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.845974 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-config-volume\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.855324 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.855291 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph58v\" (UniqueName: \"kubernetes.io/projected/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-kube-api-access-ph58v\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:53.855439 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:53.855411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjnqz\" (UniqueName: \"kubernetes.io/projected/77b6c827-22ca-45dc-8ce4-0267c31539a0-kube-api-access-cjnqz\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:54.147409 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.147366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:54.149904 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.149881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/746ed911-6342-422a-910a-f742c65c2879-original-pull-secret\") pod \"global-pull-secret-syncer-x9bnq\" (UID: \"746ed911-6342-422a-910a-f742c65c2879\") " pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:54.348374 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.348298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:54.348582 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.348399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:54.348582 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:54.348446 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:54.348582 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:54.348558 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:55.34853804 +0000 UTC m=+35.157548763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:54.348582 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:54.348560 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:54.348793 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:54.348619 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:55.348602464 +0000 UTC m=+35.157613187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:52:54.446130 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.446053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x9bnq"
Apr 17 07:52:54.754730 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:54.754559 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x9bnq"]
Apr 17 07:52:54.758561 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:54.758532 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod746ed911_6342_422a_910a_f742c65c2879.slice/crio-e719e4d41e6082e48537bb6b3f6819bf068406feea83900db098f4be7d5ed1cd WatchSource:0}: Error finding container e719e4d41e6082e48537bb6b3f6819bf068406feea83900db098f4be7d5ed1cd: Status 404 returned error can't find the container with id e719e4d41e6082e48537bb6b3f6819bf068406feea83900db098f4be7d5ed1cd
Apr 17 07:52:55.068519 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:55.068488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x9bnq" event={"ID":"746ed911-6342-422a-910a-f742c65c2879","Type":"ContainerStarted","Data":"e719e4d41e6082e48537bb6b3f6819bf068406feea83900db098f4be7d5ed1cd"}
Apr 17 07:52:55.355722 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:55.355689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:52:55.355936 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:55.355757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:55.355936 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:55.355866 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:52:55.355936 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:55.355921 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:52:55.356074 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:55.355940 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:57.35592379 +0000 UTC m=+37.164934497 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:52:55.356074 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:55.355981 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:52:57.355962195 +0000 UTC m=+37.164972907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:52:56.073305 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:56.073268 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="4d6c3167d3c50ed75bc2cd365f557aa95e3fbcf425265fc88c5d75cd27009d69" exitCode=0
Apr 17 07:52:56.073844 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:56.073356 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"4d6c3167d3c50ed75bc2cd365f557aa95e3fbcf425265fc88c5d75cd27009d69"}
Apr 17 07:52:57.078015 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:57.077974 2570 generic.go:358] "Generic (PLEG): container finished" podID="b042621c-fbf8-4739-b758-0e481535b940" containerID="6af37495ef29edee01448dea01be0d45a9f011e13211000e900d84316b254a9a" exitCode=0
Apr 17 07:52:57.078464 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:57.078014 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerDied","Data":"6af37495ef29edee01448dea01be0d45a9f011e13211000e900d84316b254a9a"}
Apr 17 07:52:57.372240 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:57.372156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:52:57.372240 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:57.372239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r" Apr 17 07:52:57.372500 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:57.372285 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 07:52:57.372500 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:57.372367 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:52:57.372500 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:57.372380 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.372361548 +0000 UTC m=+41.181372269 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found Apr 17 07:52:57.372500 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:52:57.372428 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:01.372413789 +0000 UTC m=+41.181424504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found Apr 17 07:52:59.079205 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.079125 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh"] Apr 17 07:52:59.087605 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.087573 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x9bnq" event={"ID":"746ed911-6342-422a-910a-f742c65c2879","Type":"ContainerStarted","Data":"37089fb9e9e6050f37ccd99568ccb95134c2548394df0ff90f066b191cf37796"} Apr 17 07:52:59.087747 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.087608 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgncd" event={"ID":"b042621c-fbf8-4739-b758-0e481535b940","Type":"ContainerStarted","Data":"216f02a31abfc65a73d3e3967bbdf8a328618baba453282964658a6b78e2aa25"} Apr 17 07:52:59.087829 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.087808 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.089703 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.089667 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 07:52:59.089828 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.089730 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 07:52:59.089828 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.089768 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 07:52:59.090650 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.090626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 07:52:59.090780 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.090654 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 07:52:59.090780 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.090700 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 07:52:59.090780 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.090751 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 07:52:59.091700 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.091680 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh"] Apr 17 07:52:59.107083 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.107033 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x9bnq" podStartSLOduration=33.171326978 podStartE2EDuration="37.107018198s" podCreationTimestamp="2026-04-17 07:52:22 +0000 UTC" firstStartedPulling="2026-04-17 07:52:54.76028787 +0000 UTC m=+34.569298579" lastFinishedPulling="2026-04-17 07:52:58.695979077 +0000 UTC m=+38.504989799" observedRunningTime="2026-04-17 07:52:59.106795137 +0000 UTC m=+38.915805869" watchObservedRunningTime="2026-04-17 07:52:59.107018198 +0000 UTC m=+38.916028929" Apr 17 07:52:59.128702 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.128650 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jgncd" podStartSLOduration=6.331697944 podStartE2EDuration="39.1286337s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:52:22.140332583 +0000 UTC m=+1.949343290" lastFinishedPulling="2026-04-17 07:52:54.937268323 +0000 UTC m=+34.746279046" observedRunningTime="2026-04-17 07:52:59.128288862 +0000 UTC m=+38.937299603" watchObservedRunningTime="2026-04-17 07:52:59.1286337 +0000 UTC m=+38.937644432" Apr 17 07:52:59.183735 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.183704 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgpq\" (UniqueName: 
\"kubernetes.io/projected/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-kube-api-access-fdgpq\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.183882 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.183815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.183882 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.183845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.183997 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.183977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.184084 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.184068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.184172 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.184159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.284721 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.284859 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.284859 
ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgpq\" (UniqueName: \"kubernetes.io/projected/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-kube-api-access-fdgpq\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.284964 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.284964 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284957 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.285031 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.284984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.285652 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.285633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.289148 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.289124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.291902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.291883 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgpq\" (UniqueName: \"kubernetes.io/projected/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-kube-api-access-fdgpq\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.299363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.299341 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-hub\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.299440 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.299424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-ca\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.299601 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.299581 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-cf899d868-wpbgh\" (UID: \"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.405801 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.405770 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:52:59.521883 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:52:59.521856 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh"] Apr 17 07:52:59.525244 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:52:59.525204 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89fcabc9_f5b9_4d2b_bfa4_7af1469abbf0.slice/crio-fd01f37679dab1292983c957a0d1bf59d82da914eec8459d6d32df28bc8df16e WatchSource:0}: Error finding container fd01f37679dab1292983c957a0d1bf59d82da914eec8459d6d32df28bc8df16e: Status 404 returned error can't find the container with id fd01f37679dab1292983c957a0d1bf59d82da914eec8459d6d32df28bc8df16e Apr 17 07:53:00.088801 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:00.088758 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerStarted","Data":"fd01f37679dab1292983c957a0d1bf59d82da914eec8459d6d32df28bc8df16e"} Apr 17 07:53:01.399103 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:01.399064 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r" Apr 17 07:53:01.399541 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:01.399147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9" Apr 17 07:53:01.399541 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:01.399236 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 07:53:01.399541 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:01.399261 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found 
Apr 17 07:53:01.399541 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:01.399337 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:09.399298115 +0000 UTC m=+49.208308869 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:53:01.399541 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:01.399359 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:09.399349235 +0000 UTC m=+49.208359949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:03.096145 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:03.096116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerStarted","Data":"abff524d63fb1305f1e1f80e692ea1fa145aa13248fd752389084ffcf0d7e9f9"}
Apr 17 07:53:06.104118 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:06.104077 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerStarted","Data":"16379e44bc63ac04b818ee565bdcdf63ec74df3e31f38ba425d1a56669c16c50"}
Apr 17 07:53:06.104118 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:06.104120 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerStarted","Data":"510566d710cb01c737573961ec15efd5c90d22364aaceb7e905a150c8e32ed12"}
Apr 17 07:53:06.123698 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:06.123647 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" podStartSLOduration=1.350976973 podStartE2EDuration="7.123624687s" podCreationTimestamp="2026-04-17 07:52:59 +0000 UTC" firstStartedPulling="2026-04-17 07:52:59.527024945 +0000 UTC m=+39.336035660" lastFinishedPulling="2026-04-17 07:53:05.299672663 +0000 UTC m=+45.108683374" observedRunningTime="2026-04-17 07:53:06.122419329 +0000 UTC m=+45.931430060" watchObservedRunningTime="2026-04-17 07:53:06.123624687 +0000 UTC m=+45.932635417"
Apr 17 07:53:09.460342 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:09.460282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:53:09.460755 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:09.460413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:53:09.460755 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:09.460426 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:09.460755 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:09.460486 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:25.460470475 +0000 UTC m=+65.269481183 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:53:09.460755 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:09.460545 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:09.460755 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:09.460601 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:25.460585774 +0000 UTC m=+65.269596481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:19.067571 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:19.067543 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x6ppg"
Apr 17 07:53:25.471559 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.471518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.471571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.471596 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.471684 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.471686 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.471747 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:57.471728802 +0000 UTC m=+97.280739528 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:25.472085 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.471762 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:53:57.471755151 +0000 UTC m=+97.280765859 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:53:25.473885 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.473865 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 07:53:25.482214 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.482200 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:53:25.482294 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:25.482240 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:54:29.482228219 +0000 UTC m=+129.291238926 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : secret "metrics-daemon-secret" not found
Apr 17 07:53:25.572705 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.572674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:53:25.574994 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.574978 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 07:53:25.585634 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.585615 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 07:53:25.596615 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.596598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcfw4\" (UniqueName: \"kubernetes.io/projected/7fe9b09c-633f-48ec-916f-04b365b73fcb-kube-api-access-hcfw4\") pod \"network-check-target-dh54x\" (UID: \"7fe9b09c-633f-48ec-916f-04b365b73fcb\") " pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:53:25.653944 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.653918 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-n45c2\""
Apr 17 07:53:25.662714 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.662692 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:53:25.772616 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:25.772585 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dh54x"]
Apr 17 07:53:25.776442 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:53:25.776418 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fe9b09c_633f_48ec_916f_04b365b73fcb.slice/crio-496662acc63afe33c448713fbdca39685d7b730def947b1b6f81d99d3c65e3ce WatchSource:0}: Error finding container 496662acc63afe33c448713fbdca39685d7b730def947b1b6f81d99d3c65e3ce: Status 404 returned error can't find the container with id 496662acc63afe33c448713fbdca39685d7b730def947b1b6f81d99d3c65e3ce
Apr 17 07:53:26.143016 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:26.142984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dh54x" event={"ID":"7fe9b09c-633f-48ec-916f-04b365b73fcb","Type":"ContainerStarted","Data":"496662acc63afe33c448713fbdca39685d7b730def947b1b6f81d99d3c65e3ce"}
Apr 17 07:53:29.151163 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:29.151125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dh54x" event={"ID":"7fe9b09c-633f-48ec-916f-04b365b73fcb","Type":"ContainerStarted","Data":"7f0c0af6131b451067e971382b0d92d62f44d896954c983a3aa36f68c653cce6"}
Apr 17 07:53:29.151600 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:29.151244 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:53:29.165588 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:29.165542 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dh54x" podStartSLOduration=66.005448735 podStartE2EDuration="1m9.165528743s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:53:25.778382083 +0000 UTC m=+65.587392803" lastFinishedPulling="2026-04-17 07:53:28.938462087 +0000 UTC m=+68.747472811" observedRunningTime="2026-04-17 07:53:29.164864752 +0000 UTC m=+68.973875482" watchObservedRunningTime="2026-04-17 07:53:29.165528743 +0000 UTC m=+68.974539472"
Apr 17 07:53:57.501228 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:57.501095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:53:57.501228 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:53:57.501157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:53:57.501820 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:57.501242 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 07:53:57.501820 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:57.501298 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls podName:6e47d03e-fde2-4cd8-96cd-ec060cb7afb1 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:01.50128422 +0000 UTC m=+161.310294928 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls") pod "dns-default-vrvr9" (UID: "6e47d03e-fde2-4cd8-96cd-ec060cb7afb1") : secret "dns-default-metrics-tls" not found
Apr 17 07:53:57.501820 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:57.501242 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 07:53:57.501820 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:53:57.501399 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert podName:77b6c827-22ca-45dc-8ce4-0267c31539a0 nodeName:}" failed. No retries permitted until 2026-04-17 07:55:01.501382077 +0000 UTC m=+161.310392795 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert") pod "ingress-canary-qnv6r" (UID: "77b6c827-22ca-45dc-8ce4-0267c31539a0") : secret "canary-serving-cert" not found
Apr 17 07:54:00.156591 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:00.156558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dh54x"
Apr 17 07:54:29.523277 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:29.523238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:54:29.523849 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:54:29.523461 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 07:54:29.523849 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:54:29.523548 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs podName:b124ed47-021a-4bde-8c03-dcfce0f301d8 nodeName:}" failed. No retries permitted until 2026-04-17 07:56:31.523522409 +0000 UTC m=+251.332533128 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs") pod "network-metrics-daemon-7bs5q" (UID: "b124ed47-021a-4bde-8c03-dcfce0f301d8") : secret "metrics-daemon-secret" not found
Apr 17 07:54:43.962748 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.962715 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"]
Apr 17 07:54:43.965525 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.965496 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"
Apr 17 07:54:43.968222 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.968202 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 07:54:43.968481 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.968457 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:43.972029 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.969541 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4z6g7\""
Apr 17 07:54:43.975562 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:43.975540 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"]
Apr 17 07:54:44.023471 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:44.023445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s92vv\" (UniqueName: \"kubernetes.io/projected/b20c27f2-f742-43ff-a651-7ac3dbcd83d8-kube-api-access-s92vv\") pod \"volume-data-source-validator-7c6cbb6c87-dz2t5\" (UID: \"b20c27f2-f742-43ff-a651-7ac3dbcd83d8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"
Apr 17 07:54:44.124077 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:44.124053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s92vv\" (UniqueName: \"kubernetes.io/projected/b20c27f2-f742-43ff-a651-7ac3dbcd83d8-kube-api-access-s92vv\") pod \"volume-data-source-validator-7c6cbb6c87-dz2t5\" (UID: \"b20c27f2-f742-43ff-a651-7ac3dbcd83d8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"
Apr 17 07:54:44.132716 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:44.132687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s92vv\" (UniqueName: \"kubernetes.io/projected/b20c27f2-f742-43ff-a651-7ac3dbcd83d8-kube-api-access-s92vv\") pod \"volume-data-source-validator-7c6cbb6c87-dz2t5\" (UID: \"b20c27f2-f742-43ff-a651-7ac3dbcd83d8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"
Apr 17 07:54:44.277128 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:44.277049 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"
Apr 17 07:54:44.402197 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:44.402173 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5"]
Apr 17 07:54:44.404577 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:54:44.404551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20c27f2_f742_43ff_a651_7ac3dbcd83d8.slice/crio-ce896e3591afc7a988cc2862a77cb3c8b288e480d5b84d6a84b3f59c7640fdfd WatchSource:0}: Error finding container ce896e3591afc7a988cc2862a77cb3c8b288e480d5b84d6a84b3f59c7640fdfd: Status 404 returned error can't find the container with id ce896e3591afc7a988cc2862a77cb3c8b288e480d5b84d6a84b3f59c7640fdfd
Apr 17 07:54:45.301562 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:45.301522 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5" event={"ID":"b20c27f2-f742-43ff-a651-7ac3dbcd83d8","Type":"ContainerStarted","Data":"ce896e3591afc7a988cc2862a77cb3c8b288e480d5b84d6a84b3f59c7640fdfd"}
Apr 17 07:54:46.305357 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:46.305304 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5" event={"ID":"b20c27f2-f742-43ff-a651-7ac3dbcd83d8","Type":"ContainerStarted","Data":"8d1daa151267c47814238eab3a9ea6ea2b138cccf88ec741729885b77216f61c"}
Apr 17 07:54:46.327894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:46.327764 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dz2t5" podStartSLOduration=2.13490086 podStartE2EDuration="3.327748412s" podCreationTimestamp="2026-04-17 07:54:43 +0000 UTC" firstStartedPulling="2026-04-17 07:54:44.406339036 +0000 UTC m=+144.215349744" lastFinishedPulling="2026-04-17 07:54:45.599186578 +0000 UTC m=+145.408197296" observedRunningTime="2026-04-17 07:54:46.326914009 +0000 UTC m=+146.135924737" watchObservedRunningTime="2026-04-17 07:54:46.327748412 +0000 UTC m=+146.136759143"
Apr 17 07:54:49.247664 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.247628 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"]
Apr 17 07:54:49.250547 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.250531 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"
Apr 17 07:54:49.252823 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.252802 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mbwll\""
Apr 17 07:54:49.252934 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.252823 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 07:54:49.253532 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.253518 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 07:54:49.260569 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.260549 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"]
Apr 17 07:54:49.359890 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.359860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466gk\" (UniqueName: \"kubernetes.io/projected/e854e687-08d0-4a97-876c-03bc44afc12b-kube-api-access-466gk\") pod \"migrator-74bb7799d9-sbpvg\" (UID: \"e854e687-08d0-4a97-876c-03bc44afc12b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"
Apr 17 07:54:49.460846 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.460806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-466gk\" (UniqueName: \"kubernetes.io/projected/e854e687-08d0-4a97-876c-03bc44afc12b-kube-api-access-466gk\") pod \"migrator-74bb7799d9-sbpvg\" (UID: \"e854e687-08d0-4a97-876c-03bc44afc12b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"
Apr 17 07:54:49.472350 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.472301 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-466gk\" (UniqueName: \"kubernetes.io/projected/e854e687-08d0-4a97-876c-03bc44afc12b-kube-api-access-466gk\") pod \"migrator-74bb7799d9-sbpvg\" (UID: \"e854e687-08d0-4a97-876c-03bc44afc12b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"
Apr 17 07:54:49.559066 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.558990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"
Apr 17 07:54:49.672511 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:49.672483 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg"]
Apr 17 07:54:49.676793 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:54:49.676765 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode854e687_08d0_4a97_876c_03bc44afc12b.slice/crio-fcf6ae568a1545b6a7bd57b601c889a621f96455116a07bb78228e8b452d3aa4 WatchSource:0}: Error finding container fcf6ae568a1545b6a7bd57b601c889a621f96455116a07bb78228e8b452d3aa4: Status 404 returned error can't find the container with id fcf6ae568a1545b6a7bd57b601c889a621f96455116a07bb78228e8b452d3aa4
Apr 17 07:54:50.316669 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:50.316627 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg" event={"ID":"e854e687-08d0-4a97-876c-03bc44afc12b","Type":"ContainerStarted","Data":"fcf6ae568a1545b6a7bd57b601c889a621f96455116a07bb78228e8b452d3aa4"}
Apr 17 07:54:50.508267 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:50.508234 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5v2tv_236582ca-8db3-417e-b2cf-dc0053b8afcf/dns-node-resolver/0.log"
Apr 17 07:54:51.324019 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:51.323980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg" event={"ID":"e854e687-08d0-4a97-876c-03bc44afc12b","Type":"ContainerStarted","Data":"3e623c5fb9acc83ba373538a6b3e7bcba47a620002ae105d3f3db4ddfc4a2c92"}
Apr 17 07:54:51.324019 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:51.324015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg" event={"ID":"e854e687-08d0-4a97-876c-03bc44afc12b","Type":"ContainerStarted","Data":"931ddc77cbe09b63d729ac9e39bb8dda21b91ba7e7cc25f69a83bc7c045d321e"}
Apr 17 07:54:51.341672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:51.341628 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sbpvg" podStartSLOduration=1.276988049 podStartE2EDuration="2.341616203s" podCreationTimestamp="2026-04-17 07:54:49 +0000 UTC" firstStartedPulling="2026-04-17 07:54:49.678614184 +0000 UTC m=+149.487624892" lastFinishedPulling="2026-04-17 07:54:50.743242336 +0000 UTC m=+150.552253046" observedRunningTime="2026-04-17 07:54:51.340130046 +0000 UTC m=+151.149140776" watchObservedRunningTime="2026-04-17 07:54:51.341616203 +0000 UTC m=+151.150626932"
Apr 17 07:54:51.508678 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:51.508650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4tp7h_c6a74de4-3b80-4092-8de3-2a795216aa48/node-ca/0.log"
Apr 17 07:54:56.638274 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:54:56.638239 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vrvr9" podUID="6e47d03e-fde2-4cd8-96cd-ec060cb7afb1"
Apr 17 07:54:56.652414 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:54:56.652385 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qnv6r" podUID="77b6c827-22ca-45dc-8ce4-0267c31539a0"
Apr 17 07:54:56.838190 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:54:56.838155 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7bs5q" podUID="b124ed47-021a-4bde-8c03-dcfce0f301d8"
Apr 17 07:54:57.339942 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:54:57.339909 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:55:01.548160 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.548125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:55:01.548565 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.548174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:55:01.550391 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.550358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e47d03e-fde2-4cd8-96cd-ec060cb7afb1-metrics-tls\") pod \"dns-default-vrvr9\" (UID: \"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1\") " pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:55:01.550576 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.550558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77b6c827-22ca-45dc-8ce4-0267c31539a0-cert\") pod \"ingress-canary-qnv6r\" (UID: \"77b6c827-22ca-45dc-8ce4-0267c31539a0\") " pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:55:01.843123 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.843040 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d4l2f\""
Apr 17 07:55:01.850914 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.850897 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:55:01.961549 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:01.961516 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vrvr9"]
Apr 17 07:55:01.965416 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:01.965383 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e47d03e_fde2_4cd8_96cd_ec060cb7afb1.slice/crio-4dddcb0b6f76fe3d5f7e1fe0d61b86fce24dd62e9f46e26045871dd240b65b3d WatchSource:0}: Error finding container 4dddcb0b6f76fe3d5f7e1fe0d61b86fce24dd62e9f46e26045871dd240b65b3d: Status 404 returned error can't find the container with id 4dddcb0b6f76fe3d5f7e1fe0d61b86fce24dd62e9f46e26045871dd240b65b3d
Apr 17 07:55:02.355145 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:02.355092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vrvr9" event={"ID":"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1","Type":"ContainerStarted","Data":"4dddcb0b6f76fe3d5f7e1fe0d61b86fce24dd62e9f46e26045871dd240b65b3d"}
Apr 17 07:55:03.359179 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:03.359149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vrvr9" event={"ID":"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1","Type":"ContainerStarted","Data":"e4572cb2415f90c72bb9b2e1da59019d8cabb51558162f024aa50e7b502e1654"}
Apr 17 07:55:04.363113 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:04.363074 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vrvr9" event={"ID":"6e47d03e-fde2-4cd8-96cd-ec060cb7afb1","Type":"ContainerStarted","Data":"a2b7033accbb8d66e42e55cc801ab34e0d2e0fec549a942e8ef76664d23436c2"}
Apr 17 07:55:04.363501 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:04.363200 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vrvr9"
Apr 17 07:55:04.379721 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:04.379678 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vrvr9" podStartSLOduration=130.201123704 podStartE2EDuration="2m11.379666169s" podCreationTimestamp="2026-04-17 07:52:53 +0000 UTC" firstStartedPulling="2026-04-17 07:55:01.967220032 +0000 UTC m=+161.776230741" lastFinishedPulling="2026-04-17 07:55:03.145762498 +0000 UTC m=+162.954773206" observedRunningTime="2026-04-17 07:55:04.379609622 +0000 UTC m=+164.188620352" watchObservedRunningTime="2026-04-17 07:55:04.379666169 +0000 UTC m=+164.188676890"
Apr 17 07:55:06.825084 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:06.825045 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:55:06.827761 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:06.827742 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fmxgb\""
Apr 17 07:55:06.835871 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:06.835855 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnv6r"
Apr 17 07:55:06.953155 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:06.953122 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnv6r"]
Apr 17 07:55:06.955997 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:06.955973 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77b6c827_22ca_45dc_8ce4_0267c31539a0.slice/crio-21eecc9d949922faa5b78094b3bf716280724f7681c01851ff078b7618a4c8e8 WatchSource:0}: Error finding container 21eecc9d949922faa5b78094b3bf716280724f7681c01851ff078b7618a4c8e8: Status 404 returned error can't find the container with id 21eecc9d949922faa5b78094b3bf716280724f7681c01851ff078b7618a4c8e8
Apr 17 07:55:07.372892 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:07.372853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnv6r" event={"ID":"77b6c827-22ca-45dc-8ce4-0267c31539a0","Type":"ContainerStarted","Data":"21eecc9d949922faa5b78094b3bf716280724f7681c01851ff078b7618a4c8e8"}
Apr 17 07:55:09.381911 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:09.381839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnv6r" event={"ID":"77b6c827-22ca-45dc-8ce4-0267c31539a0","Type":"ContainerStarted","Data":"a1c39d5be49883b334f0f3435a5bb4dc07760a483ceb52e3e23c6d7a2f42087f"}
Apr 17 07:55:09.396658 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:09.396614 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qnv6r" podStartSLOduration=134.997282667 podStartE2EDuration="2m16.396600233s" podCreationTimestamp="2026-04-17 07:52:53 +0000 UTC" firstStartedPulling="2026-04-17 07:55:06.957847858 +0000 UTC m=+166.766858566" lastFinishedPulling="2026-04-17 07:55:08.357165415 +0000 UTC m=+168.166176132" observedRunningTime="2026-04-17 07:55:09.396026991 +0000 UTC m=+169.205037720" watchObservedRunningTime="2026-04-17 07:55:09.396600233 +0000 UTC m=+169.205610961"
Apr 17 07:55:11.825292 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:11.825246 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:55:11.996469 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:11.996442 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7n7vn"]
Apr 17 07:55:11.999509 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:11.999493 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7n7vn"
Apr 17 07:55:12.001902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.001866 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-54hk8\""
Apr 17 07:55:12.001902 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.001890 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 07:55:12.003023 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.003005 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 07:55:12.003140 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.003051 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 07:55:12.003140 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.003060 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 07:55:12.009759 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.009729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7n7vn"]
Apr 17 07:55:12.115960 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.115883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b45jm\" (UniqueName: \"kubernetes.io/projected/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-api-access-b45jm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn"
Apr 17 07:55:12.115960 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.115915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn"
Apr 17 07:55:12.115960 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.115933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn"
Apr 17 07:55:12.116198 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.116015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-crio-socket\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn"
Apr 17 07:55:12.116198 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.116062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-data-volume\") pod \"insights-runtime-extractor-7n7vn\" (UID:
\"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217223 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b45jm\" (UniqueName: \"kubernetes.io/projected/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-api-access-b45jm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217223 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217225 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217447 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217447 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217260 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-crio-socket\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217447 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-data-volume\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217447 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-crio-socket\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217664 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-data-volume\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.217847 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.217820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.219602 ip-10-0-138-233 
kubenswrapper[2570]: I0417 07:55:12.219579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.225645 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.225620 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b45jm\" (UniqueName: \"kubernetes.io/projected/bbf30a91-db8c-4bc8-a486-9fa0c9c9b449-kube-api-access-b45jm\") pod \"insights-runtime-extractor-7n7vn\" (UID: \"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449\") " pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.307987 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.307968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7n7vn" Apr 17 07:55:12.421182 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:12.421154 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7n7vn"] Apr 17 07:55:12.424521 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:12.424498 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf30a91_db8c_4bc8_a486_9fa0c9c9b449.slice/crio-dfb6574c0e3b42dca54a68c6760765568d4e497af6b3ffb97f05dcec55c7ae41 WatchSource:0}: Error finding container dfb6574c0e3b42dca54a68c6760765568d4e497af6b3ffb97f05dcec55c7ae41: Status 404 returned error can't find the container with id dfb6574c0e3b42dca54a68c6760765568d4e497af6b3ffb97f05dcec55c7ae41 Apr 17 07:55:13.392077 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:13.391997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n7vn" event={"ID":"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449","Type":"ContainerStarted","Data":"928b50972613545f3826f68994fa05297d48a8966346aede41a2264c6ff9b0c5"} Apr 17 07:55:13.392077 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:13.392032 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n7vn" event={"ID":"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449","Type":"ContainerStarted","Data":"947b10882c9e874a65876ae4d59efb9138fd9da07e1fe1c5a7fe0778f4d12d54"} Apr 17 07:55:13.392077 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:13.392041 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n7vn" event={"ID":"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449","Type":"ContainerStarted","Data":"dfb6574c0e3b42dca54a68c6760765568d4e497af6b3ffb97f05dcec55c7ae41"} Apr 17 07:55:14.367993 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:14.367960 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vrvr9" Apr 17 07:55:15.399248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:15.399207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7n7vn" event={"ID":"bbf30a91-db8c-4bc8-a486-9fa0c9c9b449","Type":"ContainerStarted","Data":"1f5bb7b4e0d3950d87517e4812e0be5ed5dfb5f474457c0cb56754d13b67c27d"} Apr 17 07:55:15.415775 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:15.415724 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-7n7vn" podStartSLOduration=2.422699812 podStartE2EDuration="4.415708035s" podCreationTimestamp="2026-04-17 07:55:11 +0000 UTC" firstStartedPulling="2026-04-17 07:55:12.480779265 +0000 UTC m=+172.289789974" lastFinishedPulling="2026-04-17 07:55:14.473787488 +0000 UTC m=+174.282798197" observedRunningTime="2026-04-17 07:55:15.415515549 +0000 UTC m=+175.224526291" watchObservedRunningTime="2026-04-17 07:55:15.415708035 +0000 UTC m=+175.224718767" Apr 17 07:55:25.390374 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.390341 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2bbh6"] Apr 17 07:55:25.392891 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.392875 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.395228 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.395209 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 07:55:25.395530 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.395511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bpbfn\"" Apr 17 07:55:25.395668 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.395542 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 07:55:25.396025 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.396006 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 07:55:25.396114 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.396009 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 07:55:25.396562 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.396548 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 07:55:25.396828 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.396814 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 07:55:25.507491 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-root\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507665 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507665 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507534 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-textfile\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507665 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507615 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-wtmp\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507665 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-tls\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507804 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-metrics-client-ca\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507804 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507804 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-sys\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.507804 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.507758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctps\" (UniqueName: \"kubernetes.io/projected/c534ab59-8b06-4344-a4a6-61cdbdfe5340-kube-api-access-lctps\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608197 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-metrics-client-ca\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608197 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bbh6\" (UID: 
\"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-sys\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lctps\" (UniqueName: \"kubernetes.io/projected/c534ab59-8b06-4344-a4a6-61cdbdfe5340-kube-api-access-lctps\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-root\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-root\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-textfile\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608550 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-sys\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-wtmp\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-tls\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608712 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-wtmp\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608970 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-textfile\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608970 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-metrics-client-ca\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.608970 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.608949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.610537 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.610510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.610768 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.610753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c534ab59-8b06-4344-a4a6-61cdbdfe5340-node-exporter-tls\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.617298 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.617272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctps\" (UniqueName: \"kubernetes.io/projected/c534ab59-8b06-4344-a4a6-61cdbdfe5340-kube-api-access-lctps\") pod \"node-exporter-2bbh6\" (UID: \"c534ab59-8b06-4344-a4a6-61cdbdfe5340\") " pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.701765 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:25.701697 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2bbh6" Apr 17 07:55:25.709566 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:25.709540 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc534ab59_8b06_4344_a4a6_61cdbdfe5340.slice/crio-639fd8fb210611dd87e7bd2e41a1d6ab844fc91a910c5d14e5f53c29a8d290e7 WatchSource:0}: Error finding container 639fd8fb210611dd87e7bd2e41a1d6ab844fc91a910c5d14e5f53c29a8d290e7: Status 404 returned error can't find the container with id 639fd8fb210611dd87e7bd2e41a1d6ab844fc91a910c5d14e5f53c29a8d290e7 Apr 17 07:55:26.427889 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:26.427849 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bbh6" event={"ID":"c534ab59-8b06-4344-a4a6-61cdbdfe5340","Type":"ContainerStarted","Data":"639fd8fb210611dd87e7bd2e41a1d6ab844fc91a910c5d14e5f53c29a8d290e7"} Apr 17 07:55:27.357091 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.357058 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd"] Apr 17 07:55:27.360429 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.360413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.362754 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.362733 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 07:55:27.362982 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.362950 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-r4swd\"" Apr 17 07:55:27.363215 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.363197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 07:55:27.363298 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.363261 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fkct393kelouq\"" Apr 17 07:55:27.363542 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.363525 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 07:55:27.363617 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.363549 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 07:55:27.363617 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.363567 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 07:55:27.370507 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.370489 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd"] Apr 17 07:55:27.421635 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: 
\"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.421635 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.421836 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421658 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.421836 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.421836 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56926dfe-e5dc-4b01-abb6-98d9a1516d23-metrics-client-ca\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.421836 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldktm\" (UniqueName: \"kubernetes.io/projected/56926dfe-e5dc-4b01-abb6-98d9a1516d23-kube-api-access-ldktm\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.422056 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421862 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.422056 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.421880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-grpc-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.431979 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.431947 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="c534ab59-8b06-4344-a4a6-61cdbdfe5340" containerID="612fd3293dcfbff116eb3f682512c0e4faa47478ef4636bb70de1486a22ae930" exitCode=0 Apr 17 07:55:27.432347 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.431982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bbh6" event={"ID":"c534ab59-8b06-4344-a4a6-61cdbdfe5340","Type":"ContainerDied","Data":"612fd3293dcfbff116eb3f682512c0e4faa47478ef4636bb70de1486a22ae930"} Apr 17 07:55:27.523115 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523221 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56926dfe-e5dc-4b01-abb6-98d9a1516d23-metrics-client-ca\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523221 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldktm\" (UniqueName: \"kubernetes.io/projected/56926dfe-e5dc-4b01-abb6-98d9a1516d23-kube-api-access-ldktm\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523221 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523378 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-grpc-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523378 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523378 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: 
\"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523534 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.523942 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.523914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56926dfe-e5dc-4b01-abb6-98d9a1516d23-metrics-client-ca\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.526252 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.526226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-grpc-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.526448 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.526429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.526530 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.526514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.526582 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.526521 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-tls\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.526927 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.526904 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.527038 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.527016 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/56926dfe-e5dc-4b01-abb6-98d9a1516d23-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.532304 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.532281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldktm\" (UniqueName: \"kubernetes.io/projected/56926dfe-e5dc-4b01-abb6-98d9a1516d23-kube-api-access-ldktm\") pod \"thanos-querier-6ccbbc7cfd-7pkwd\" (UID: \"56926dfe-e5dc-4b01-abb6-98d9a1516d23\") " pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.669563 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.669530 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:27.788779 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:27.788743 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd"] Apr 17 07:55:27.793183 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:27.793152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56926dfe_e5dc_4b01_abb6_98d9a1516d23.slice/crio-6dc4b4effc951f9651ad03e19a961c18e7736ed6458724440509a3b27c16e080 WatchSource:0}: Error finding container 6dc4b4effc951f9651ad03e19a961c18e7736ed6458724440509a3b27c16e080: Status 404 returned error can't find the container with id 6dc4b4effc951f9651ad03e19a961c18e7736ed6458724440509a3b27c16e080 Apr 17 07:55:28.436053 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:28.436013 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"6dc4b4effc951f9651ad03e19a961c18e7736ed6458724440509a3b27c16e080"} Apr 17 07:55:28.437946 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:28.437913 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bbh6" event={"ID":"c534ab59-8b06-4344-a4a6-61cdbdfe5340","Type":"ContainerStarted","Data":"8cea3232e08aa70c077315f6599ee2731a293dc2978771ffa1c42ee073e5b614"} Apr 17 07:55:28.438066 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:28.437948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bbh6" event={"ID":"c534ab59-8b06-4344-a4a6-61cdbdfe5340","Type":"ContainerStarted","Data":"1e90c7a14fc00fa85158d43a86f2aeaae182cb96e716e17458e1458be1b92f8f"} Apr 17 07:55:28.458665 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:28.458607 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2bbh6" podStartSLOduration=2.656159156 podStartE2EDuration="3.458591923s" podCreationTimestamp="2026-04-17 07:55:25 +0000 UTC" firstStartedPulling="2026-04-17 07:55:25.711203418 +0000 UTC m=+185.520214130" lastFinishedPulling="2026-04-17 07:55:26.513636179 +0000 UTC m=+186.322646897" observedRunningTime="2026-04-17 07:55:28.457538628 +0000 UTC m=+188.266549372" watchObservedRunningTime="2026-04-17 07:55:28.458591923 +0000 UTC m=+188.267602652" Apr 17 07:55:29.407327 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.407242 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" 
podUID="89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:55:29.782676 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.782652 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8cf54c56c-5tfr4"] Apr 17 07:55:29.785519 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.785501 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.787784 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.787763 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 07:55:29.788541 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.788523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-53po8mcq6v1ds\"" Apr 17 07:55:29.788618 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.788546 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 07:55:29.788873 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.788855 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 07:55:29.788985 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.788909 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 07:55:29.789051 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.788982 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m95zh\"" Apr 17 07:55:29.795222 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.795197 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8cf54c56c-5tfr4"] Apr 17 07:55:29.944745 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-metrics-server-audit-profiles\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.944890 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944755 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.944890 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-client-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.944890 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944877 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-client-certs\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.945061 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-tls\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.945061 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944931 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e93a016-654f-4003-b4eb-45f420b5b7ec-audit-log\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:29.945061 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:29.944992 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4b6\" (UniqueName: \"kubernetes.io/projected/9e93a016-654f-4003-b4eb-45f420b5b7ec-kube-api-access-rt4b6\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046169 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4b6\" (UniqueName: \"kubernetes.io/projected/9e93a016-654f-4003-b4eb-45f420b5b7ec-kube-api-access-rt4b6\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046169 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-metrics-server-audit-profiles\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046169 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046478 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-client-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046478 
ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-client-certs\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046478 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-tls\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046478 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e93a016-654f-4003-b4eb-45f420b5b7ec-audit-log\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.046860 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.046827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9e93a016-654f-4003-b4eb-45f420b5b7ec-audit-log\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.047098 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.047075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.047279 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.047259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9e93a016-654f-4003-b4eb-45f420b5b7ec-metrics-server-audit-profiles\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.049202 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.048837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-tls\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.049202 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.049170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-secret-metrics-server-client-certs\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.049366 ip-10-0-138-233 kubenswrapper[2570]: I0417 
07:55:30.049346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e93a016-654f-4003-b4eb-45f420b5b7ec-client-ca-bundle\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.053752 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.053731 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4b6\" (UniqueName: \"kubernetes.io/projected/9e93a016-654f-4003-b4eb-45f420b5b7ec-kube-api-access-rt4b6\") pod \"metrics-server-8cf54c56c-5tfr4\" (UID: \"9e93a016-654f-4003-b4eb-45f420b5b7ec\") " pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.095290 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.095250 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:30.140973 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.140939 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp"] Apr 17 07:55:30.145707 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.145680 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:30.148190 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.148109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 07:55:30.148335 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.148189 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cj5r4\"" Apr 17 07:55:30.157462 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.157435 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp"] Apr 17 07:55:30.214687 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.214643 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8cf54c56c-5tfr4"] Apr 17 07:55:30.217830 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:30.217799 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e93a016_654f_4003_b4eb_45f420b5b7ec.slice/crio-66f0e8e198053c34dfd079c1c79144afa17a28d4f2b03fbbb7cbde35061e8639 WatchSource:0}: Error finding container 66f0e8e198053c34dfd079c1c79144afa17a28d4f2b03fbbb7cbde35061e8639: Status 404 returned error can't find the container with id 66f0e8e198053c34dfd079c1c79144afa17a28d4f2b03fbbb7cbde35061e8639 Apr 17 07:55:30.248102 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.248065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8d8lp\" (UID: \"ab3c33d3-4737-4053-8863-52fa1498a50d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:30.349508 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.349417 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert\") pod 
\"monitoring-plugin-7dccd58f55-8d8lp\" (UID: \"ab3c33d3-4737-4053-8863-52fa1498a50d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:30.349654 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:55:30.349577 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 07:55:30.349654 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:55:30.349644 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert podName:ab3c33d3-4737-4053-8863-52fa1498a50d nodeName:}" failed. No retries permitted until 2026-04-17 07:55:30.849626141 +0000 UTC m=+190.658636864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8d8lp" (UID: "ab3c33d3-4737-4053-8863-52fa1498a50d") : secret "monitoring-plugin-cert" not found Apr 17 07:55:30.445363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.445330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"da450f0937d3d09632089f65515571e01d66d14c3aa4c7279dff84ae36d19649"} Apr 17 07:55:30.445363 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.445368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"92373bfe74979f0d02298b67197993477da061d3f723bdfa743ca1bb77e7eeb8"} Apr 17 07:55:30.445623 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.445384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"a0047e0ea93207997d74f07fb4e9bbe27b9a24024d9b001a19a105c27e36f310"} Apr 17 07:55:30.446207 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.446185 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" event={"ID":"9e93a016-654f-4003-b4eb-45f420b5b7ec","Type":"ContainerStarted","Data":"66f0e8e198053c34dfd079c1c79144afa17a28d4f2b03fbbb7cbde35061e8639"} Apr 17 07:55:30.854044 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.854011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8d8lp\" (UID: \"ab3c33d3-4737-4053-8863-52fa1498a50d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:30.857125 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:30.857102 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ab3c33d3-4737-4053-8863-52fa1498a50d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8d8lp\" (UID: \"ab3c33d3-4737-4053-8863-52fa1498a50d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:31.058556 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.058465 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:31.194481 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.194450 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp"] Apr 17 07:55:31.451975 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.451938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"8ef0e35f0fa552a5d798b9212512e9f0bc92ec0cf62e49c8746317b14b34aadf"} Apr 17 07:55:31.451975 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.451978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"efa9171efd409ba96d2bd2e9f6001e2bf09254a8ed78e61a003a42808a4bd111"} Apr 17 07:55:31.452163 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.451988 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" event={"ID":"56926dfe-e5dc-4b01-abb6-98d9a1516d23","Type":"ContainerStarted","Data":"b430e9ae4589c13f6f8030199c3067b0b5bb980d30000094ef2b95bce0579f65"} Apr 17 07:55:31.452163 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.452157 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:31.472400 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:31.472333 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" podStartSLOduration=1.488243974 podStartE2EDuration="4.472301417s" podCreationTimestamp="2026-04-17 07:55:27 +0000 UTC" firstStartedPulling="2026-04-17 07:55:27.795119637 +0000 UTC m=+187.604130347" lastFinishedPulling="2026-04-17 07:55:30.779177059 +0000 UTC m=+190.588187790" observedRunningTime="2026-04-17 07:55:31.471427209 +0000 UTC m=+191.280437955" watchObservedRunningTime="2026-04-17 07:55:31.472301417 +0000 UTC m=+191.281312172" Apr 17 07:55:31.552359 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:31.552305 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3c33d3_4737_4053_8863_52fa1498a50d.slice/crio-13234849892d1d0f1f7dec6a66edcf070d281fda9105e799194ca38a8014b847 WatchSource:0}: Error finding container 13234849892d1d0f1f7dec6a66edcf070d281fda9105e799194ca38a8014b847: Status 404 returned error can't find the container with id 13234849892d1d0f1f7dec6a66edcf070d281fda9105e799194ca38a8014b847 Apr 17 07:55:32.458476 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:32.457688 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" event={"ID":"9e93a016-654f-4003-b4eb-45f420b5b7ec","Type":"ContainerStarted","Data":"44c7f7f8ff1f3c6faba57dec45add173c0159035e64d4f5290e202fcadae5d96"} Apr 17 07:55:32.460368 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:32.460335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" event={"ID":"ab3c33d3-4737-4053-8863-52fa1498a50d","Type":"ContainerStarted","Data":"13234849892d1d0f1f7dec6a66edcf070d281fda9105e799194ca38a8014b847"} Apr 17 07:55:32.479850 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:32.479783 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" podStartSLOduration=2.100963462 podStartE2EDuration="3.479762593s" podCreationTimestamp="2026-04-17 07:55:29 +0000 UTC" firstStartedPulling="2026-04-17 07:55:30.21963384 +0000 UTC m=+190.028644549" lastFinishedPulling="2026-04-17 07:55:31.598432969 +0000 UTC m=+191.407443680" observedRunningTime="2026-04-17 07:55:32.478871849 +0000 UTC m=+192.287882580" watchObservedRunningTime="2026-04-17 07:55:32.479762593 +0000 UTC m=+192.288773325" Apr 17 07:55:33.464295 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:33.464257 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" event={"ID":"ab3c33d3-4737-4053-8863-52fa1498a50d","Type":"ContainerStarted","Data":"4d2d1d849954c777abef791f10731ec5cd49bf263f9d3ee8961883503e0282b4"} Apr 17 07:55:33.464647 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:33.464455 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:33.469140 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:33.469120 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" Apr 17 07:55:33.482303 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:33.482259 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8d8lp" podStartSLOduration=2.368657493 podStartE2EDuration="3.482248328s" podCreationTimestamp="2026-04-17 07:55:30 +0000 UTC" firstStartedPulling="2026-04-17 07:55:31.554052817 +0000 UTC m=+191.363063525" lastFinishedPulling="2026-04-17 07:55:32.66764364 +0000 UTC m=+192.476654360" observedRunningTime="2026-04-17 07:55:33.481239986 +0000 UTC m=+193.290250717" watchObservedRunningTime="2026-04-17 07:55:33.482248328 +0000 UTC m=+193.291259107" Apr 17 07:55:37.465941 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:37.465914 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6ccbbc7cfd-7pkwd" Apr 17 07:55:39.407459 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:39.407418 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" podUID="89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:55:44.599652 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.599620 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79997b875-c2s2f"] Apr 17 07:55:44.602195 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.602177 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.604493 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.604470 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 07:55:44.605217 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605199 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 07:55:44.605292 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605213 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 07:55:44.605292 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605204 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j6lrp\"" Apr 17 07:55:44.605432 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605292 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 07:55:44.605432 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605394 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 07:55:44.605432 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 07:55:44.605432 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.605400 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 07:55:44.608892 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.608859 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 07:55:44.610805 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.610786 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79997b875-c2s2f"] Apr 17 07:55:44.766456 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766416 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766456 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766655 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766655 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766502 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbsl\" (UniqueName: \"kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766655 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766655 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766622 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.766655 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.766641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867380 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867340 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867400 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbsl\" (UniqueName: \"kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config\") pod 
\"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.867552 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.867550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.868201 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.868173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.868386 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.868212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.868386 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.868278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.868496 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.868276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.869896 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.869873 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.870027 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.870005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.874809 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.874786 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbsl\" 
(UniqueName: \"kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl\") pod \"console-79997b875-c2s2f\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") " pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:44.911881 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:44.911839 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:45.027963 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:45.027931 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79997b875-c2s2f"] Apr 17 07:55:45.030898 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:55:45.030866 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4eeb19a_4b47_4183_80d8_8e0b6967d094.slice/crio-08f6623228c35d8a55f95c1428d9ced72dc53368810b991d80ba06f7fbec7b03 WatchSource:0}: Error finding container 08f6623228c35d8a55f95c1428d9ced72dc53368810b991d80ba06f7fbec7b03: Status 404 returned error can't find the container with id 08f6623228c35d8a55f95c1428d9ced72dc53368810b991d80ba06f7fbec7b03 Apr 17 07:55:45.495894 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:45.495854 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79997b875-c2s2f" event={"ID":"f4eeb19a-4b47-4183-80d8-8e0b6967d094","Type":"ContainerStarted","Data":"08f6623228c35d8a55f95c1428d9ced72dc53368810b991d80ba06f7fbec7b03"} Apr 17 07:55:48.504519 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:48.504484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79997b875-c2s2f" event={"ID":"f4eeb19a-4b47-4183-80d8-8e0b6967d094","Type":"ContainerStarted","Data":"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"} Apr 17 07:55:48.520072 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:48.520021 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79997b875-c2s2f" podStartSLOduration=1.8780248099999999 podStartE2EDuration="4.520006874s" podCreationTimestamp="2026-04-17 07:55:44 +0000 UTC" firstStartedPulling="2026-04-17 07:55:45.032811593 +0000 UTC m=+204.841822307" lastFinishedPulling="2026-04-17 07:55:47.674793646 +0000 UTC m=+207.483804371" observedRunningTime="2026-04-17 07:55:48.51954798 +0000 UTC m=+208.328558709" watchObservedRunningTime="2026-04-17 07:55:48.520006874 +0000 UTC m=+208.329017603" Apr 17 07:55:49.407564 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:49.407524 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" podUID="89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 07:55:49.407720 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:49.407594 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" Apr 17 07:55:49.408099 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:49.408067 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"16379e44bc63ac04b818ee565bdcdf63ec74df3e31f38ba425d1a56669c16c50"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" containerMessage="Container service-proxy failed liveness probe, will 
be restarted" Apr 17 07:55:49.408148 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:49.408133 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" podUID="89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0" containerName="service-proxy" containerID="cri-o://16379e44bc63ac04b818ee565bdcdf63ec74df3e31f38ba425d1a56669c16c50" gracePeriod=30 Apr 17 07:55:50.095864 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:50.095816 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:50.095864 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:50.095875 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:55:50.511438 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:50.511403 2570 generic.go:358] "Generic (PLEG): container finished" podID="89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0" containerID="16379e44bc63ac04b818ee565bdcdf63ec74df3e31f38ba425d1a56669c16c50" exitCode=2 Apr 17 07:55:50.511600 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:50.511444 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerDied","Data":"16379e44bc63ac04b818ee565bdcdf63ec74df3e31f38ba425d1a56669c16c50"} Apr 17 07:55:50.511600 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:50.511465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-cf899d868-wpbgh" event={"ID":"89fcabc9-f5b9-4d2b-bfa4-7af1469abbf0","Type":"ContainerStarted","Data":"c9b7f137611fb164034b09fb442affbf131216570cbc29f683f8f33160cbd51c"} Apr 17 07:55:54.912002 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:54.911952 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:54.912536 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:54.912022 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:54.916941 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:54.916918 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:55:55.530105 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:55:55.530071 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79997b875-c2s2f" Apr 17 07:56:10.100746 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:10.100717 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:56:10.104599 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:10.104576 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8cf54c56c-5tfr4" Apr 17 07:56:31.538241 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:31.538204 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b124ed47-021a-4bde-8c03-dcfce0f301d8-metrics-certs\") pod \"network-metrics-daemon-7bs5q\" (UID: \"b124ed47-021a-4bde-8c03-dcfce0f301d8\") " pod="openshift-multus/network-metrics-daemon-7bs5q" Apr 17 
Apr 17 07:56:31.628657 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:31.628630 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jkb85\""
Apr 17 07:56:31.636447 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:31.636427 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7bs5q"
Apr 17 07:56:31.749418 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:31.749386 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7bs5q"]
Apr 17 07:56:31.752484 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:56:31.752448 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb124ed47_021a_4bde_8c03_dcfce0f301d8.slice/crio-4ba3ec0c353bc5e49bd2b61924b3157a3e16befe908aa5db4d40731483dd80b8 WatchSource:0}: Error finding container 4ba3ec0c353bc5e49bd2b61924b3157a3e16befe908aa5db4d40731483dd80b8: Status 404 returned error can't find the container with id 4ba3ec0c353bc5e49bd2b61924b3157a3e16befe908aa5db4d40731483dd80b8
Apr 17 07:56:32.623863 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:32.623828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bs5q" event={"ID":"b124ed47-021a-4bde-8c03-dcfce0f301d8","Type":"ContainerStarted","Data":"4ba3ec0c353bc5e49bd2b61924b3157a3e16befe908aa5db4d40731483dd80b8"}
Apr 17 07:56:33.627789 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:33.627753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bs5q" event={"ID":"b124ed47-021a-4bde-8c03-dcfce0f301d8","Type":"ContainerStarted","Data":"b3d09fc4888ab6e8b1aad2d73eb36a26de50b465d7f6e860b58cbf2ed2753701"}
Apr 17 07:56:33.627789 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:33.627788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7bs5q" event={"ID":"b124ed47-021a-4bde-8c03-dcfce0f301d8","Type":"ContainerStarted","Data":"a5012ade3cc2da6062520edcbfa218035d3212c6e0461962dcc0dba90cb134fe"}
Apr 17 07:56:33.644052 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:56:33.644003 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7bs5q" podStartSLOduration=252.719793829 podStartE2EDuration="4m13.643988438s" podCreationTimestamp="2026-04-17 07:52:20 +0000 UTC" firstStartedPulling="2026-04-17 07:56:31.754762549 +0000 UTC m=+251.563773258" lastFinishedPulling="2026-04-17 07:56:32.678957154 +0000 UTC m=+252.487967867" observedRunningTime="2026-04-17 07:56:33.642611924 +0000 UTC m=+253.451622663" watchObservedRunningTime="2026-04-17 07:56:33.643988438 +0000 UTC m=+253.452999519"
Apr 17 07:57:04.045331 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:04.045282 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79997b875-c2s2f"]
Apr 17 07:57:20.689911 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:20.689885 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 07:57:29.064412 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.064346 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79997b875-c2s2f" podUID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" containerName="console" containerID="cri-o://2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab" gracePeriod=15
Apr 17 07:57:29.296033 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.296011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79997b875-c2s2f_f4eeb19a-4b47-4183-80d8-8e0b6967d094/console/0.log"
Apr 17 07:57:29.296138 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.296084 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79997b875-c2s2f"
Apr 17 07:57:29.366681 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366588 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366681 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366630 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366681 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366673 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366711 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366740 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366769 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.366955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.366796 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddbsl\" (UniqueName: \"kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl\") pod \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\" (UID: \"f4eeb19a-4b47-4183-80d8-8e0b6967d094\") "
Apr 17 07:57:29.367146 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.367086 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca" (OuterVolumeSpecName: "service-ca") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:57:29.367146 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.367099 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:57:29.367248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.367145 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:57:29.367402 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.367375 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config" (OuterVolumeSpecName: "console-config") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 07:57:29.369122 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.369093 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl" (OuterVolumeSpecName: "kube-api-access-ddbsl") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "kube-api-access-ddbsl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 07:57:29.369122 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.369109 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:57:29.369237 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.369136 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f4eeb19a-4b47-4183-80d8-8e0b6967d094" (UID: "f4eeb19a-4b47-4183-80d8-8e0b6967d094"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 07:57:29.468012 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.467974 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-trusted-ca-bundle\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468012 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468007 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-oauth-serving-cert\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468012 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468017 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-service-ca\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468027 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-oauth-config\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468037 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-config\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468046 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4eeb19a-4b47-4183-80d8-8e0b6967d094-console-serving-cert\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.468248 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.468055 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddbsl\" (UniqueName: \"kubernetes.io/projected/f4eeb19a-4b47-4183-80d8-8e0b6967d094-kube-api-access-ddbsl\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\""
Apr 17 07:57:29.791779 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791753 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79997b875-c2s2f_f4eeb19a-4b47-4183-80d8-8e0b6967d094/console/0.log"
Apr 17 07:57:29.791955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791792 2570 generic.go:358] "Generic (PLEG): container finished" podID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" containerID="2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab" exitCode=2
Apr 17 07:57:29.791955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791851 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79997b875-c2s2f" event={"ID":"f4eeb19a-4b47-4183-80d8-8e0b6967d094","Type":"ContainerDied","Data":"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"}
Apr 17 07:57:29.791955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791866 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79997b875-c2s2f"
Apr 17 07:57:29.791955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791883 2570 scope.go:117] "RemoveContainer" containerID="2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"
Apr 17 07:57:29.791955 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.791874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79997b875-c2s2f" event={"ID":"f4eeb19a-4b47-4183-80d8-8e0b6967d094","Type":"ContainerDied","Data":"08f6623228c35d8a55f95c1428d9ced72dc53368810b991d80ba06f7fbec7b03"}
Apr 17 07:57:29.800171 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.800153 2570 scope.go:117] "RemoveContainer" containerID="2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"
Apr 17 07:57:29.800452 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:57:29.800431 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab\": container with ID starting with 2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab not found: ID does not exist" containerID="2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"
Apr 17 07:57:29.800515 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.800461 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab"} err="failed to get container status \"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab\": rpc error: code = NotFound desc = could not find container \"2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab\": container with ID starting with 2ad09ae7d9408af1c4f74d1305a5e8ed21864a76e1a58fad3535e2dce451d7ab not found: ID does not exist"
Apr 17 07:57:29.809412 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.809385 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79997b875-c2s2f"]
Apr 17 07:57:29.812121 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:29.812102 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79997b875-c2s2f"]
Apr 17 07:57:30.829906 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:57:30.829871 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" path="/var/lib/kubelet/pods/f4eeb19a-4b47-4183-80d8-8e0b6967d094/volumes"
Apr 17 07:58:10.496556 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.496478 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"]
Apr 17 07:58:10.496996 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.496793 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" containerName="console"
Apr 17 07:58:10.496996 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.496805 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" containerName="console"
Apr 17 07:58:10.496996 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.496851 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4eeb19a-4b47-4183-80d8-8e0b6967d094" containerName="console"
Apr 17 07:58:10.499561 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.499545 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95c5b86f5-74gvv"
Need to start a new one" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.501907 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.501881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 07:58:10.501907 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.501894 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 07:58:10.502701 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.502677 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 07:58:10.502701 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.502681 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j6lrp\"" Apr 17 07:58:10.502865 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.502708 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 07:58:10.502865 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.502750 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 07:58:10.502865 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.502808 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 07:58:10.503020 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.503007 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 07:58:10.506801 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.506760 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 07:58:10.508122 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.508103 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"] Apr 17 07:58:10.564512 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564512 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564512 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564707 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564546 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564707 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564628 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564707 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564798 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqkh\" (UniqueName: \"kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.564798 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.564757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666003 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.665963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqkh\" (UniqueName: \"kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666003 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666167 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666167 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666167 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca\") pod 
\"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666167 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666371 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666871 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.666871 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.666867 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.667046 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.667025 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.667377 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.667356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.668647 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.668617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.668760 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.668737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.674075 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.674047 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqkh\" 
(UniqueName: \"kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh\") pod \"console-95c5b86f5-74gvv\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.809797 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.809709 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:10.930244 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.930213 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"] Apr 17 07:58:10.933582 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:58:10.933552 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c6e4478_2fb3_4be0_8055_71511aca685f.slice/crio-9b436608afbcff633a2d49507e4bbaa66b492154cddd901ad688afb41bf7fd6f WatchSource:0}: Error finding container 9b436608afbcff633a2d49507e4bbaa66b492154cddd901ad688afb41bf7fd6f: Status 404 returned error can't find the container with id 9b436608afbcff633a2d49507e4bbaa66b492154cddd901ad688afb41bf7fd6f Apr 17 07:58:10.935360 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:10.935342 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 07:58:11.902053 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:11.902019 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5b86f5-74gvv" event={"ID":"9c6e4478-2fb3-4be0-8055-71511aca685f","Type":"ContainerStarted","Data":"068e95c35552abf2050ad9ca5a8925d475de7d921052467bd73b202de8901073"} Apr 17 07:58:11.902053 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:11.902056 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5b86f5-74gvv" event={"ID":"9c6e4478-2fb3-4be0-8055-71511aca685f","Type":"ContainerStarted","Data":"9b436608afbcff633a2d49507e4bbaa66b492154cddd901ad688afb41bf7fd6f"} Apr 17 07:58:11.919927 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:11.919875 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-95c5b86f5-74gvv" podStartSLOduration=1.919858052 podStartE2EDuration="1.919858052s" podCreationTimestamp="2026-04-17 07:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 07:58:11.919604288 +0000 UTC m=+351.728615032" watchObservedRunningTime="2026-04-17 07:58:11.919858052 +0000 UTC m=+351.728868770" Apr 17 07:58:20.809911 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:20.809881 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:20.810254 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:20.809919 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:20.814577 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:20.814558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:58:20.929391 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:58:20.929359 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 07:59:20.320070 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.320036 2570 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-sqm7l"] Apr 17 07:59:20.323198 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.323180 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.325613 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.325586 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 07:59:20.325749 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.325626 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 07:59:20.325749 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.325623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 07:59:20.326265 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.326248 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 07:59:20.326265 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.326257 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k5vgx\"" Apr 17 07:59:20.326437 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.326263 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 07:59:20.331590 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.331569 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-sqm7l"] Apr 17 07:59:20.390245 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.390216 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-cabundle0\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.390375 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.390255 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.390375 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.390349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z94l2\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-kube-api-access-z94l2\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.491300 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.491271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.491458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.491338 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z94l2\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-kube-api-access-z94l2\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.491458 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.491376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-cabundle0\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.491458 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.491418 2570 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:59:20.491458 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.491441 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 07:59:20.491458 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.491453 2570 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-sqm7l: references non-existent secret key: ca.crt Apr 17 07:59:20.491618 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.491541 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates podName:39b5f5d8-913d-4bdb-be6b-11cc60d5e532 nodeName:}" failed. No retries permitted until 2026-04-17 07:59:20.991521128 +0000 UTC m=+420.800531850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates") pod "keda-operator-ffbb595cb-sqm7l" (UID: "39b5f5d8-913d-4bdb-be6b-11cc60d5e532") : references non-existent secret key: ca.crt Apr 17 07:59:20.491983 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.491965 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-cabundle0\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.501928 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.501902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z94l2\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-kube-api-access-z94l2\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.996233 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:20.996200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:20.996428 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.996348 2570 secret.go:281] references non-existent secret key: ca.crt Apr 17 07:59:20.996428 ip-10-0-138-233 kubenswrapper[2570]: E0417 07:59:20.996362 2570 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references 
Apr 17 07:59:22.006331 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.006280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l"
Apr 17 07:59:22.008672 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.008644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/39b5f5d8-913d-4bdb-be6b-11cc60d5e532-certificates\") pod \"keda-operator-ffbb595cb-sqm7l\" (UID: \"39b5f5d8-913d-4bdb-be6b-11cc60d5e532\") " pod="openshift-keda/keda-operator-ffbb595cb-sqm7l"
Apr 17 07:59:22.135647 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.135608 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k5vgx\""
Apr 17 07:59:22.144378 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.144348 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l"
Apr 17 07:59:22.257951 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.257848 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-sqm7l"]
Apr 17 07:59:22.262590 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:59:22.262562 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b5f5d8_913d_4bdb_be6b_11cc60d5e532.slice/crio-63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f WatchSource:0}: Error finding container 63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f: Status 404 returned error can't find the container with id 63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f
Apr 17 07:59:23.087247 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:23.087201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" event={"ID":"39b5f5d8-913d-4bdb-be6b-11cc60d5e532","Type":"ContainerStarted","Data":"63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f"}
Apr 17 07:59:26.098028 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.097986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" event={"ID":"39b5f5d8-913d-4bdb-be6b-11cc60d5e532","Type":"ContainerStarted","Data":"26826f485fa4f1214c0772b78603863d91aa38e571530b3e2448bca6400390b1"}
Apr 17 07:59:26.098504 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.098107 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l"
Apr 17 07:59:26.115331 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.115267 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" podStartSLOduration=3.0178295 podStartE2EDuration="6.115253463s" podCreationTimestamp="2026-04-17 07:59:20 +0000 UTC" firstStartedPulling="2026-04-17 07:59:22.263810224 +0000 UTC m=+422.072820933" lastFinishedPulling="2026-04-17 07:59:25.361234185 +0000 UTC m=+425.170244896" observedRunningTime="2026-04-17 07:59:26.114018137 +0000 UTC m=+425.923028872" watchObservedRunningTime="2026-04-17 07:59:26.115253463 +0000 UTC m=+425.924264193"
Apr 17 07:59:47.103050 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:47.102970 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l"
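
The pod_startup_latency_tracker entry above carries enough data to re-derive its own podStartSLOduration: it is podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check using the m=+ monotonic offsets from the entry:

    package main

    import "fmt"

    // Re-derives podStartSLOduration for keda-operator-ffbb595cb-sqm7l
    // from the entry above: SLO duration excludes the image-pull window.
    func main() {
        const (
            firstStartedPulling = 422.072820933 // m=+ offset, seconds
            lastFinishedPulling = 425.170244896 // m=+ offset, seconds
            e2e                 = 6.115253463   // podStartE2EDuration, seconds
        )
        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, e2e-pull) // slo ≈ 3.017829500s
    }

For pods whose image was already on the node, such as the console pods elsewhere in this log, both pull timestamps are the zero value (0001-01-01 00:00:00 +0000 UTC) and the SLO duration equals the end-to-end duration.
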
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:22.257951 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:22.257848 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-sqm7l"] Apr 17 07:59:22.262590 ip-10-0-138-233 kubenswrapper[2570]: W0417 07:59:22.262562 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b5f5d8_913d_4bdb_be6b_11cc60d5e532.slice/crio-63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f WatchSource:0}: Error finding container 63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f: Status 404 returned error can't find the container with id 63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f Apr 17 07:59:23.087247 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:23.087201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" event={"ID":"39b5f5d8-913d-4bdb-be6b-11cc60d5e532","Type":"ContainerStarted","Data":"63ad4dff322c44981e4c860209f9ab3b00cafc70e5dba0ee9307edffff8d8a2f"} Apr 17 07:59:26.098028 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.097986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" event={"ID":"39b5f5d8-913d-4bdb-be6b-11cc60d5e532","Type":"ContainerStarted","Data":"26826f485fa4f1214c0772b78603863d91aa38e571530b3e2448bca6400390b1"} Apr 17 07:59:26.098504 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.098107 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 07:59:26.115331 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:26.115267 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" podStartSLOduration=3.0178295 podStartE2EDuration="6.115253463s" podCreationTimestamp="2026-04-17 07:59:20 +0000 UTC" firstStartedPulling="2026-04-17 07:59:22.263810224 +0000 UTC m=+422.072820933" lastFinishedPulling="2026-04-17 07:59:25.361234185 +0000 UTC m=+425.170244896" observedRunningTime="2026-04-17 07:59:26.114018137 +0000 UTC m=+425.923028872" watchObservedRunningTime="2026-04-17 07:59:26.115253463 +0000 UTC m=+425.924264193" Apr 17 07:59:47.103050 ip-10-0-138-233 kubenswrapper[2570]: I0417 07:59:47.102970 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-sqm7l" Apr 17 08:00:28.063708 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.063670 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:00:28.066840 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.066824 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.071065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.071035 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 08:00:28.071192 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.071116 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 08:00:28.071416 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.071398 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 08:00:28.071416 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.071410 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fhd9b\"" Apr 17 08:00:28.086001 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.085981 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:00:28.119692 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.119668 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-pjck4"] Apr 17 08:00:28.122682 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.122665 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.128273 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.128251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 08:00:28.128397 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.128364 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-77vl8\"" Apr 17 08:00:28.148493 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.148462 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pjck4"] Apr 17 08:00:28.200163 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.200130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlmlw\" (UniqueName: \"kubernetes.io/projected/a937ea63-07c0-4da3-a786-5a086c699abf-kube-api-access-xlmlw\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.200286 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.200172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.200286 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.200256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5fjs\" (UniqueName: \"kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.200286 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.200279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a937ea63-07c0-4da3-a786-5a086c699abf-data\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.301055 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.301025 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5fjs\" (UniqueName: \"kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.301055 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.301059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a937ea63-07c0-4da3-a786-5a086c699abf-data\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.301285 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.301086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlmlw\" (UniqueName: \"kubernetes.io/projected/a937ea63-07c0-4da3-a786-5a086c699abf-kube-api-access-xlmlw\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.301285 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.301115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.301478 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.301458 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a937ea63-07c0-4da3-a786-5a086c699abf-data\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.303480 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.303461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.314770 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.314717 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlmlw\" (UniqueName: \"kubernetes.io/projected/a937ea63-07c0-4da3-a786-5a086c699abf-kube-api-access-xlmlw\") pod \"seaweedfs-86cc847c5c-pjck4\" (UID: \"a937ea63-07c0-4da3-a786-5a086c699abf\") " pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.318890 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.318870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5fjs\" (UniqueName: \"kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs\") pod \"kserve-controller-manager-558564fd68-plplz\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.376865 ip-10-0-138-233 
kubenswrapper[2570]: I0417 08:00:28.376833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:28.431481 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.431456 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:28.505168 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.505135 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:00:28.509172 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:00:28.509126 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12682bf6_d8ea_4977_a855_64b1644c5450.slice/crio-df0444add19b7770b6209b5d64c7157ec50c6e23af4b2a40701dfe9bc6a6d304 WatchSource:0}: Error finding container df0444add19b7770b6209b5d64c7157ec50c6e23af4b2a40701dfe9bc6a6d304: Status 404 returned error can't find the container with id df0444add19b7770b6209b5d64c7157ec50c6e23af4b2a40701dfe9bc6a6d304 Apr 17 08:00:28.558674 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:28.558648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pjck4"] Apr 17 08:00:28.561550 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:00:28.561518 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda937ea63_07c0_4da3_a786_5a086c699abf.slice/crio-4667f4b8c7d80caa2fa7e45858dee6c059b091a324f203d90c8ebf2f15b38663 WatchSource:0}: Error finding container 4667f4b8c7d80caa2fa7e45858dee6c059b091a324f203d90c8ebf2f15b38663: Status 404 returned error can't find the container with id 4667f4b8c7d80caa2fa7e45858dee6c059b091a324f203d90c8ebf2f15b38663 Apr 17 08:00:29.267893 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:29.267841 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pjck4" event={"ID":"a937ea63-07c0-4da3-a786-5a086c699abf","Type":"ContainerStarted","Data":"4667f4b8c7d80caa2fa7e45858dee6c059b091a324f203d90c8ebf2f15b38663"} Apr 17 08:00:29.270109 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:29.270075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-plplz" event={"ID":"12682bf6-d8ea-4977-a855-64b1644c5450","Type":"ContainerStarted","Data":"df0444add19b7770b6209b5d64c7157ec50c6e23af4b2a40701dfe9bc6a6d304"} Apr 17 08:00:32.281688 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.281649 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pjck4" event={"ID":"a937ea63-07c0-4da3-a786-5a086c699abf","Type":"ContainerStarted","Data":"9f5ce05e54b8fcc2f3085c6baab944cc7d0a94daec832366c44caac4c27d359a"} Apr 17 08:00:32.282123 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.281888 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:00:32.283160 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.283126 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-plplz" event={"ID":"12682bf6-d8ea-4977-a855-64b1644c5450","Type":"ContainerStarted","Data":"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f"} Apr 17 08:00:32.283284 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.283269 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:00:32.297235 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.297183 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-pjck4" podStartSLOduration=0.702278025 podStartE2EDuration="4.297171221s" podCreationTimestamp="2026-04-17 08:00:28 +0000 UTC" firstStartedPulling="2026-04-17 08:00:28.562904546 +0000 UTC m=+488.371915254" lastFinishedPulling="2026-04-17 08:00:32.157797743 +0000 UTC m=+491.966808450" observedRunningTime="2026-04-17 08:00:32.295862412 +0000 UTC m=+492.104873143" watchObservedRunningTime="2026-04-17 08:00:32.297171221 +0000 UTC m=+492.106181950" Apr 17 08:00:32.313066 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:32.313025 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-plplz" podStartSLOduration=0.760566474 podStartE2EDuration="4.313012234s" podCreationTimestamp="2026-04-17 08:00:28 +0000 UTC" firstStartedPulling="2026-04-17 08:00:28.510491421 +0000 UTC m=+488.319502129" lastFinishedPulling="2026-04-17 08:00:32.062937178 +0000 UTC m=+491.871947889" observedRunningTime="2026-04-17 08:00:32.312063382 +0000 UTC m=+492.121074123" watchObservedRunningTime="2026-04-17 08:00:32.313012234 +0000 UTC m=+492.122022964" Apr 17 08:00:38.288391 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:00:38.288356 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-pjck4" Apr 17 08:01:03.291256 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:03.291222 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:01:10.118735 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.118657 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:01:10.119093 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.118899 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-558564fd68-plplz" podUID="12682bf6-d8ea-4977-a855-64b1644c5450" containerName="manager" containerID="cri-o://aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f" gracePeriod=10 Apr 17 08:01:10.145825 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.145795 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-558564fd68-d94vb"] Apr 17 08:01:10.148900 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.148883 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.156094 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.156063 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-d94vb"] Apr 17 08:01:10.320173 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.320139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-cert\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") " pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.320173 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.320173 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpj46\" (UniqueName: \"kubernetes.io/projected/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-kube-api-access-zpj46\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") " pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.358438 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.358418 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:01:10.395757 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.395665 2570 generic.go:358] "Generic (PLEG): container finished" podID="12682bf6-d8ea-4977-a855-64b1644c5450" containerID="aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f" exitCode=0 Apr 17 08:01:10.395757 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.395730 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-plplz" event={"ID":"12682bf6-d8ea-4977-a855-64b1644c5450","Type":"ContainerDied","Data":"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f"} Apr 17 08:01:10.395954 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.395764 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-plplz" Apr 17 08:01:10.395954 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.395772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-plplz" event={"ID":"12682bf6-d8ea-4977-a855-64b1644c5450","Type":"ContainerDied","Data":"df0444add19b7770b6209b5d64c7157ec50c6e23af4b2a40701dfe9bc6a6d304"} Apr 17 08:01:10.395954 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.395794 2570 scope.go:117] "RemoveContainer" containerID="aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f" Apr 17 08:01:10.404041 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.404023 2570 scope.go:117] "RemoveContainer" containerID="aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f" Apr 17 08:01:10.404309 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:01:10.404289 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f\": container with ID starting with aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f not found: ID does not exist" containerID="aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f" Apr 17 08:01:10.404368 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.404338 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f"} err="failed to get container status \"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f\": rpc error: code = NotFound desc = could not find container \"aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f\": container with ID starting with aa243c5221f1e28b80126ce98546c07f16d0dfe200b3275c54ad275fa0dbdb8f not found: ID does not exist" Apr 17 08:01:10.420788 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.420762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-cert\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") " pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.420904 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.420795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpj46\" (UniqueName: \"kubernetes.io/projected/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-kube-api-access-zpj46\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") " pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.423193 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.423169 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-cert\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") " pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.428517 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.428494 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpj46\" (UniqueName: \"kubernetes.io/projected/5599dd9d-d9a4-43d4-97d4-ebc58704ae98-kube-api-access-zpj46\") pod \"kserve-controller-manager-558564fd68-d94vb\" (UID: \"5599dd9d-d9a4-43d4-97d4-ebc58704ae98\") 
" pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.499936 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.499897 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:10.521743 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.521715 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert\") pod \"12682bf6-d8ea-4977-a855-64b1644c5450\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " Apr 17 08:01:10.521870 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.521749 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5fjs\" (UniqueName: \"kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs\") pod \"12682bf6-d8ea-4977-a855-64b1644c5450\" (UID: \"12682bf6-d8ea-4977-a855-64b1644c5450\") " Apr 17 08:01:10.523897 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.523870 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs" (OuterVolumeSpecName: "kube-api-access-x5fjs") pod "12682bf6-d8ea-4977-a855-64b1644c5450" (UID: "12682bf6-d8ea-4977-a855-64b1644c5450"). InnerVolumeSpecName "kube-api-access-x5fjs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:01:10.523993 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.523876 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert" (OuterVolumeSpecName: "cert") pod "12682bf6-d8ea-4977-a855-64b1644c5450" (UID: "12682bf6-d8ea-4977-a855-64b1644c5450"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:01:10.615113 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.615076 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-d94vb"] Apr 17 08:01:10.618012 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:01:10.617982 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5599dd9d_d9a4_43d4_97d4_ebc58704ae98.slice/crio-7ddfeb1de1d11656f8293fb323f1c3157cf2c566345a772e1fa40a9747f28869 WatchSource:0}: Error finding container 7ddfeb1de1d11656f8293fb323f1c3157cf2c566345a772e1fa40a9747f28869: Status 404 returned error can't find the container with id 7ddfeb1de1d11656f8293fb323f1c3157cf2c566345a772e1fa40a9747f28869 Apr 17 08:01:10.622663 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.622645 2570 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12682bf6-d8ea-4977-a855-64b1644c5450-cert\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:01:10.622721 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.622666 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5fjs\" (UniqueName: \"kubernetes.io/projected/12682bf6-d8ea-4977-a855-64b1644c5450-kube-api-access-x5fjs\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:01:10.716088 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.716051 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:01:10.720159 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.720135 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-558564fd68-plplz"] Apr 17 08:01:10.828459 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:10.828424 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12682bf6-d8ea-4977-a855-64b1644c5450" path="/var/lib/kubelet/pods/12682bf6-d8ea-4977-a855-64b1644c5450/volumes" Apr 17 08:01:11.399840 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:11.399804 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-d94vb" event={"ID":"5599dd9d-d9a4-43d4-97d4-ebc58704ae98","Type":"ContainerStarted","Data":"f25930ca796353e3a5dcfa36fe6a2655fd73b400194d75bfb9a2da4cd227e24d"} Apr 17 08:01:11.400281 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:11.399839 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-558564fd68-d94vb" event={"ID":"5599dd9d-d9a4-43d4-97d4-ebc58704ae98","Type":"ContainerStarted","Data":"7ddfeb1de1d11656f8293fb323f1c3157cf2c566345a772e1fa40a9747f28869"} Apr 17 08:01:11.400281 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:11.399890 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:11.415472 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:11.415425 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-558564fd68-d94vb" podStartSLOduration=0.943432847 podStartE2EDuration="1.41540672s" podCreationTimestamp="2026-04-17 08:01:10 +0000 UTC" firstStartedPulling="2026-04-17 08:01:10.619070422 +0000 UTC m=+530.428081130" lastFinishedPulling="2026-04-17 08:01:11.09104428 +0000 UTC m=+530.900055003" observedRunningTime="2026-04-17 08:01:11.414437723 +0000 UTC m=+531.223448452" 
watchObservedRunningTime="2026-04-17 08:01:11.41540672 +0000 UTC m=+531.224417450" Apr 17 08:01:42.409145 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:42.409114 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-558564fd68-d94vb" Apr 17 08:01:43.288007 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.287972 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6w5g2"] Apr 17 08:01:43.288282 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.288269 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12682bf6-d8ea-4977-a855-64b1644c5450" containerName="manager" Apr 17 08:01:43.288350 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.288284 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="12682bf6-d8ea-4977-a855-64b1644c5450" containerName="manager" Apr 17 08:01:43.288387 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.288358 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="12682bf6-d8ea-4977-a855-64b1644c5450" containerName="manager" Apr 17 08:01:43.291098 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.291082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.293421 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.293398 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 08:01:43.293558 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.293427 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-b2ctw\"" Apr 17 08:01:43.300356 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.300332 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6w5g2"] Apr 17 08:01:43.360006 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.359966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.360006 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.360006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd26q\" (UniqueName: \"kubernetes.io/projected/ee5a709a-57ed-4987-8fbb-f46c9e961e33-kube-api-access-xd26q\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.461153 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.461112 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.461153 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.461150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd26q\" (UniqueName: \"kubernetes.io/projected/ee5a709a-57ed-4987-8fbb-f46c9e961e33-kube-api-access-xd26q\") pod 
\"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.461704 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:01:43.461334 2570 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 08:01:43.461704 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:01:43.461457 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert podName:ee5a709a-57ed-4987-8fbb-f46c9e961e33 nodeName:}" failed. No retries permitted until 2026-04-17 08:01:43.961432427 +0000 UTC m=+563.770443135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert") pod "odh-model-controller-696fc77849-6w5g2" (UID: "ee5a709a-57ed-4987-8fbb-f46c9e961e33") : secret "odh-model-controller-webhook-cert" not found Apr 17 08:01:43.472701 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.472676 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd26q\" (UniqueName: \"kubernetes.io/projected/ee5a709a-57ed-4987-8fbb-f46c9e961e33-kube-api-access-xd26q\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.964574 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.964541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:43.966996 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:43.966960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee5a709a-57ed-4987-8fbb-f46c9e961e33-cert\") pod \"odh-model-controller-696fc77849-6w5g2\" (UID: \"ee5a709a-57ed-4987-8fbb-f46c9e961e33\") " pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:44.202668 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:44.202619 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:44.316199 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:44.316118 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6w5g2"] Apr 17 08:01:44.318848 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:01:44.318817 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee5a709a_57ed_4987_8fbb_f46c9e961e33.slice/crio-41c10b2e2d5f8d61f9e4a600dcd1edb5d888ce0815ddeab1097ba2b71b364f64 WatchSource:0}: Error finding container 41c10b2e2d5f8d61f9e4a600dcd1edb5d888ce0815ddeab1097ba2b71b364f64: Status 404 returned error can't find the container with id 41c10b2e2d5f8d61f9e4a600dcd1edb5d888ce0815ddeab1097ba2b71b364f64 Apr 17 08:01:44.493572 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:44.493532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6w5g2" event={"ID":"ee5a709a-57ed-4987-8fbb-f46c9e961e33","Type":"ContainerStarted","Data":"41c10b2e2d5f8d61f9e4a600dcd1edb5d888ce0815ddeab1097ba2b71b364f64"} Apr 17 08:01:47.504020 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:47.503985 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6w5g2" event={"ID":"ee5a709a-57ed-4987-8fbb-f46c9e961e33","Type":"ContainerStarted","Data":"1f0dd101e38820805276a3f51e5a4ad4cee502142a88a2f7ee30e3ebf6eacf8e"} Apr 17 08:01:47.504478 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:47.504126 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:01:47.523081 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:47.523032 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6w5g2" podStartSLOduration=1.9849709679999998 podStartE2EDuration="4.523015991s" podCreationTimestamp="2026-04-17 08:01:43 +0000 UTC" firstStartedPulling="2026-04-17 08:01:44.320038387 +0000 UTC m=+564.129049099" lastFinishedPulling="2026-04-17 08:01:46.858083411 +0000 UTC m=+566.667094122" observedRunningTime="2026-04-17 08:01:47.520935241 +0000 UTC m=+567.329945983" watchObservedRunningTime="2026-04-17 08:01:47.523015991 +0000 UTC m=+567.332026722" Apr 17 08:01:54.729899 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.729865 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f67f789-dvtdm"] Apr 17 08:01:54.732733 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.732717 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.780846 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.780814 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f67f789-dvtdm"] Apr 17 08:01:54.854861 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.854822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-trusted-ca-bundle\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.854861 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.854865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4mz\" (UniqueName: \"kubernetes.io/projected/c35a3b2d-f227-49e1-88e9-12fd203c04d8-kube-api-access-jw4mz\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.855104 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.854920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-service-ca\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.855104 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.854961 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-oauth-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.855104 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.854988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.855104 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.855043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.855291 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.855133 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-oauth-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956211 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-service-ca\") pod \"console-5f67f789-dvtdm\" 
(UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956211 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-oauth-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-oauth-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-trusted-ca-bundle\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.956473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4mz\" (UniqueName: \"kubernetes.io/projected/c35a3b2d-f227-49e1-88e9-12fd203c04d8-kube-api-access-jw4mz\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.957010 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.957108 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.956991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-service-ca\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.957166 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.957115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-oauth-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.957268 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.957249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35a3b2d-f227-49e1-88e9-12fd203c04d8-trusted-ca-bundle\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.958747 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.958719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-serving-cert\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.958896 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.958877 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c35a3b2d-f227-49e1-88e9-12fd203c04d8-console-oauth-config\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:54.965239 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:54.965219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4mz\" (UniqueName: \"kubernetes.io/projected/c35a3b2d-f227-49e1-88e9-12fd203c04d8-kube-api-access-jw4mz\") pod \"console-5f67f789-dvtdm\" (UID: \"c35a3b2d-f227-49e1-88e9-12fd203c04d8\") " pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:55.041325 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:55.041234 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:01:55.165904 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:55.165877 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f67f789-dvtdm"] Apr 17 08:01:55.168545 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:01:55.168519 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35a3b2d_f227_49e1_88e9_12fd203c04d8.slice/crio-66d561bab38a6e74d4c40be973da5bf30abb2750e81979889b2b8e754d4d4837 WatchSource:0}: Error finding container 66d561bab38a6e74d4c40be973da5bf30abb2750e81979889b2b8e754d4d4837: Status 404 returned error can't find the container with id 66d561bab38a6e74d4c40be973da5bf30abb2750e81979889b2b8e754d4d4837 Apr 17 08:01:55.527580 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:55.527543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f67f789-dvtdm" event={"ID":"c35a3b2d-f227-49e1-88e9-12fd203c04d8","Type":"ContainerStarted","Data":"2cb87222bc188a031317769cf036fdcd016a1f5642ef1818952212d921ee80d3"} Apr 17 08:01:55.527580 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:55.527583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f67f789-dvtdm" event={"ID":"c35a3b2d-f227-49e1-88e9-12fd203c04d8","Type":"ContainerStarted","Data":"66d561bab38a6e74d4c40be973da5bf30abb2750e81979889b2b8e754d4d4837"} Apr 17 08:01:55.548759 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:55.548703 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f67f789-dvtdm" podStartSLOduration=1.548686845 podStartE2EDuration="1.548686845s" podCreationTimestamp="2026-04-17 08:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:01:55.547994229 +0000 UTC m=+575.357004958" watchObservedRunningTime="2026-04-17 08:01:55.548686845 +0000 UTC m=+575.357697577" Apr 17 08:01:58.509734 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:01:58.509702 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6w5g2" Apr 17 08:02:05.041767 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:05.041719 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:02:05.042168 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:05.041813 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:02:05.046449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:05.046428 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:02:05.561141 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:05.561114 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f67f789-dvtdm" Apr 17 08:02:05.608637 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:05.608596 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"] Apr 17 08:02:18.906150 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:18.906114 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:02:18.913890 ip-10-0-138-233 kubenswrapper[2570]: 
I0417 08:02:18.913872 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:18.916393 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:18.916174 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6w6b9\"" Apr 17 08:02:18.917512 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:18.917490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:02:19.065618 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.065582 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-444sz\" (UID: \"dac5afcb-a1b8-43e1-8ffa-f1579954ac75\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:19.166825 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.166736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-444sz\" (UID: \"dac5afcb-a1b8-43e1-8ffa-f1579954ac75\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:19.167153 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.167131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-444sz\" (UID: \"dac5afcb-a1b8-43e1-8ffa-f1579954ac75\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:19.174868 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.174841 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"] Apr 17 08:02:19.182890 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.182561 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:02:19.182890 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.182701 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" Apr 17 08:02:19.187755 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.187726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"] Apr 17 08:02:19.187880 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.187826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:02:19.193281 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.193255 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:02:19.197480 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.197460 2570 util.go:30] "No sandbox for pod can be found. 
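The repeated util.go:30 entries ("No sandbox for pod can be found. Need to start a new one") are the kubelet deciding to create a fresh pod sandbox through the CRI. A minimal sketch of what a sandbox creation looks like against CRI-O's socket using the published CRI API; the socket path, metadata, and error handling are illustrative (names taken from the log), not kubelet's actual code path:

    package main

    import (
        "context"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // CRI-O serves the CRI over a local UNIX socket; no TLS is involved.
        conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        rt := runtimeapi.NewRuntimeServiceClient(conn)
        resp, err := rt.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
            Config: &runtimeapi.PodSandboxConfig{
                Metadata: &runtimeapi.PodSandboxMetadata{
                    Name:      "isvc-sklearn-graph-1-predictor-69d5d56664-444sz",
                    Namespace: "kserve-ci-e2e-test",
                    Uid:       "dac5afcb-a1b8-43e1-8ffa-f1579954ac75", // pod UID from the log
                },
            },
        })
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("sandbox: %s", resp.PodSandboxId)
    }

Containers are then created and started inside the returned sandbox, which is what the subsequent "SyncLoop (PLEG)" ContainerStarted events report.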
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" Apr 17 08:02:19.224756 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.224723 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:19.342464 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.342434 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"] Apr 17 08:02:19.346491 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:02:19.346449 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc14a4091_fd8e_410c_a9bc_8a3dd336f24f.slice/crio-ad5d2575bdca03cbb69804625673ae97ad743a6451f460fa677e310da51d3046 WatchSource:0}: Error finding container ad5d2575bdca03cbb69804625673ae97ad743a6451f460fa677e310da51d3046: Status 404 returned error can't find the container with id ad5d2575bdca03cbb69804625673ae97ad743a6451f460fa677e310da51d3046 Apr 17 08:02:19.368883 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.368851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-zwcb5\" (UID: \"ed0e16d2-e569-49ed-8a3c-387d722dbe93\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:02:19.375799 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.375775 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:02:19.381012 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.380985 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:02:19.381112 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.381101 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:19.382045 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:02:19.382024 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac5afcb_a1b8_43e1_8ffa_f1579954ac75.slice/crio-26e209bc05b0b042a7955043f1939e4a96efbe54122336d29c43225988c8cdbc WatchSource:0}: Error finding container 26e209bc05b0b042a7955043f1939e4a96efbe54122336d29c43225988c8cdbc: Status 404 returned error can't find the container with id 26e209bc05b0b042a7955043f1939e4a96efbe54122336d29c43225988c8cdbc Apr 17 08:02:19.387911 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.387888 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:02:19.470018 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.469940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn\" (UID: \"9641e509-6813-4e2f-97d4-efc806374a09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:19.470018 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.469989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-zwcb5\" (UID: \"ed0e16d2-e569-49ed-8a3c-387d722dbe93\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:02:19.470295 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.470279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-zwcb5\" (UID: \"ed0e16d2-e569-49ed-8a3c-387d722dbe93\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:02:19.508030 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.508003 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:02:19.570881 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.570851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn\" (UID: \"9641e509-6813-4e2f-97d4-efc806374a09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:19.571202 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.571180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn\" (UID: \"9641e509-6813-4e2f-97d4-efc806374a09\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:19.598531 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.598490 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerStarted","Data":"26e209bc05b0b042a7955043f1939e4a96efbe54122336d29c43225988c8cdbc"} Apr 17 08:02:19.600095 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.599998 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" event={"ID":"c14a4091-fd8e-410c-a9bc-8a3dd336f24f","Type":"ContainerStarted","Data":"ad5d2575bdca03cbb69804625673ae97ad743a6451f460fa677e310da51d3046"} Apr 17 08:02:19.624660 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.624637 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:02:19.627233 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:02:19.627205 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded0e16d2_e569_49ed_8a3c_387d722dbe93.slice/crio-a2b4dd1dbc8957e49395163df3dd026df477cec0e40d4599125816c6f9e09f13 WatchSource:0}: Error finding container a2b4dd1dbc8957e49395163df3dd026df477cec0e40d4599125816c6f9e09f13: Status 404 returned error can't find the container with id a2b4dd1dbc8957e49395163df3dd026df477cec0e40d4599125816c6f9e09f13 Apr 17 08:02:19.694289 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.694263 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:19.813500 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:19.813465 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:02:19.816279 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:02:19.816251 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9641e509_6813_4e2f_97d4_efc806374a09.slice/crio-c3a1976909fdb58b3ffeca0db608a9832c82fc88287d873b3c4e63c7d4008e1c WatchSource:0}: Error finding container c3a1976909fdb58b3ffeca0db608a9832c82fc88287d873b3c4e63c7d4008e1c: Status 404 returned error can't find the container with id c3a1976909fdb58b3ffeca0db608a9832c82fc88287d873b3c4e63c7d4008e1c Apr 17 08:02:20.605989 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:20.605907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerStarted","Data":"a2b4dd1dbc8957e49395163df3dd026df477cec0e40d4599125816c6f9e09f13"} Apr 17 08:02:20.612215 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:20.612183 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerStarted","Data":"c3a1976909fdb58b3ffeca0db608a9832c82fc88287d873b3c4e63c7d4008e1c"} Apr 17 08:02:25.632583 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:25.632536 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerStarted","Data":"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7"} Apr 17 08:02:25.633758 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:25.633734 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerStarted","Data":"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf"} Apr 17 08:02:25.634960 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:25.634935 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerStarted","Data":"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3"} Apr 17 08:02:29.653006 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:29.652968 2570 generic.go:358] "Generic (PLEG): container finished" podID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerID="ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7" exitCode=0 Apr 17 08:02:29.653513 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:29.653045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerDied","Data":"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7"} Apr 17 08:02:30.631461 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:30.631417 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-95c5b86f5-74gvv" 
podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerName="console" containerID="cri-o://068e95c35552abf2050ad9ca5a8925d475de7d921052467bd73b202de8901073" gracePeriod=15 Apr 17 08:02:30.926938 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:30.926852 2570 patch_prober.go:28] interesting pod/console-95c5b86f5-74gvv container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body= Apr 17 08:02:30.926938 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:30.926922 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-95c5b86f5-74gvv" podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerName="console" probeResult="failure" output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" Apr 17 08:02:31.661263 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.661226 2570 generic.go:358] "Generic (PLEG): container finished" podID="9641e509-6813-4e2f-97d4-efc806374a09" containerID="31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf" exitCode=0 Apr 17 08:02:31.661476 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.661305 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerDied","Data":"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf"} Apr 17 08:02:31.663328 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.663291 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95c5b86f5-74gvv_9c6e4478-2fb3-4be0-8055-71511aca685f/console/0.log" Apr 17 08:02:31.663419 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.663349 2570 generic.go:358] "Generic (PLEG): container finished" podID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerID="068e95c35552abf2050ad9ca5a8925d475de7d921052467bd73b202de8901073" exitCode=2 Apr 17 08:02:31.663419 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.663413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5b86f5-74gvv" event={"ID":"9c6e4478-2fb3-4be0-8055-71511aca685f","Type":"ContainerDied","Data":"068e95c35552abf2050ad9ca5a8925d475de7d921052467bd73b202de8901073"} Apr 17 08:02:31.665362 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.665338 2570 generic.go:358] "Generic (PLEG): container finished" podID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerID="690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3" exitCode=0 Apr 17 08:02:31.665443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:31.665374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerDied","Data":"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3"} Apr 17 08:02:32.591465 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.591439 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95c5b86f5-74gvv_9c6e4478-2fb3-4be0-8055-71511aca685f/console/0.log" Apr 17 08:02:32.591848 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.591520 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 08:02:32.671207 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.671168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" event={"ID":"c14a4091-fd8e-410c-a9bc-8a3dd336f24f","Type":"ContainerStarted","Data":"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"} Apr 17 08:02:32.671622 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.671601 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" Apr 17 08:02:32.672884 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.672858 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 17 08:02:32.673680 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.673652 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-95c5b86f5-74gvv_9c6e4478-2fb3-4be0-8055-71511aca685f/console/0.log" Apr 17 08:02:32.673762 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.673709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95c5b86f5-74gvv" event={"ID":"9c6e4478-2fb3-4be0-8055-71511aca685f","Type":"ContainerDied","Data":"9b436608afbcff633a2d49507e4bbaa66b492154cddd901ad688afb41bf7fd6f"} Apr 17 08:02:32.673820 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.673760 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95c5b86f5-74gvv" Apr 17 08:02:32.673820 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.673737 2570 scope.go:117] "RemoveContainer" containerID="068e95c35552abf2050ad9ca5a8925d475de7d921052467bd73b202de8901073" Apr 17 08:02:32.679831 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.679806 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.679931 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.679855 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtqkh\" (UniqueName: \"kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.680169 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680143 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.680240 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680214 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 
08:02:32.680292 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680245 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.680419 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680399 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.680484 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680440 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert\") pod \"9c6e4478-2fb3-4be0-8055-71511aca685f\" (UID: \"9c6e4478-2fb3-4be0-8055-71511aca685f\") " Apr 17 08:02:32.680616 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680594 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:02:32.680692 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680678 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-service-ca\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.680992 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.680970 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:02:32.681074 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.681022 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config" (OuterVolumeSpecName: "console-config") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:02:32.681139 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.681120 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:02:32.683339 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.683278 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh" (OuterVolumeSpecName: "kube-api-access-gtqkh") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "kube-api-access-gtqkh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:02:32.684342 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.684168 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:02:32.684342 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.684287 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9c6e4478-2fb3-4be0-8055-71511aca685f" (UID: "9c6e4478-2fb3-4be0-8055-71511aca685f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:02:32.687878 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.687708 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podStartSLOduration=0.535205301 podStartE2EDuration="13.687691758s" podCreationTimestamp="2026-04-17 08:02:19 +0000 UTC" firstStartedPulling="2026-04-17 08:02:19.348683617 +0000 UTC m=+599.157694328" lastFinishedPulling="2026-04-17 08:02:32.501170075 +0000 UTC m=+612.310180785" observedRunningTime="2026-04-17 08:02:32.687166708 +0000 UTC m=+612.496177436" watchObservedRunningTime="2026-04-17 08:02:32.687691758 +0000 UTC m=+612.496702491" Apr 17 08:02:32.781917 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781883 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-console-config\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.781917 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781918 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-trusted-ca-bundle\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.782160 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781935 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-serving-cert\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.782160 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781949 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c6e4478-2fb3-4be0-8055-71511aca685f-oauth-serving-cert\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.782160 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781964 2570 reconciler_common.go:299] "Volume detached 
for volume \"kube-api-access-gtqkh\" (UniqueName: \"kubernetes.io/projected/9c6e4478-2fb3-4be0-8055-71511aca685f-kube-api-access-gtqkh\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.782160 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.781978 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c6e4478-2fb3-4be0-8055-71511aca685f-console-oauth-config\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:02:32.998758 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:32.997186 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"] Apr 17 08:02:33.000672 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:33.000643 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-95c5b86f5-74gvv"] Apr 17 08:02:33.681245 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:33.681151 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 17 08:02:34.830977 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:34.830940 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" path="/var/lib/kubelet/pods/9c6e4478-2fb3-4be0-8055-71511aca685f/volumes" Apr 17 08:02:39.706496 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.706454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerStarted","Data":"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3"} Apr 17 08:02:39.706957 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.706915 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:02:39.708440 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.708397 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 17 08:02:39.709028 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.709007 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerStarted","Data":"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b"} Apr 17 08:02:39.709337 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.709296 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:02:39.710375 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.710340 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 08:02:39.724533 ip-10-0-138-233 kubenswrapper[2570]: I0417 
08:02:39.724492 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podStartSLOduration=1.376617006 podStartE2EDuration="20.72447867s" podCreationTimestamp="2026-04-17 08:02:19 +0000 UTC" firstStartedPulling="2026-04-17 08:02:19.818097797 +0000 UTC m=+599.627108505" lastFinishedPulling="2026-04-17 08:02:39.165959454 +0000 UTC m=+618.974970169" observedRunningTime="2026-04-17 08:02:39.721966496 +0000 UTC m=+619.530977226" watchObservedRunningTime="2026-04-17 08:02:39.72447867 +0000 UTC m=+619.533489399" Apr 17 08:02:39.739588 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:39.739542 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podStartSLOduration=1.9740591429999998 podStartE2EDuration="21.739528821s" podCreationTimestamp="2026-04-17 08:02:18 +0000 UTC" firstStartedPulling="2026-04-17 08:02:19.38405381 +0000 UTC m=+599.193064518" lastFinishedPulling="2026-04-17 08:02:39.149523481 +0000 UTC m=+618.958534196" observedRunningTime="2026-04-17 08:02:39.738981384 +0000 UTC m=+619.547992127" watchObservedRunningTime="2026-04-17 08:02:39.739528821 +0000 UTC m=+619.548539551" Apr 17 08:02:40.712830 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:40.712772 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 17 08:02:40.713277 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:40.712997 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 08:02:43.682050 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:43.682003 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 17 08:02:50.713729 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:50.713682 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 08:02:50.714273 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:50.713683 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 17 08:02:53.681734 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:53.681687 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 17 
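The pod_startup_latency_tracker entries above are internally consistent: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling − firstStartedPulling). A quick check in Go of the isvc-sklearn-graph-2 entry, with the monotonic "m=+…" suffixes trimmed (the kubelet uses monotonic readings, so the last few nanoseconds can differ):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }

        created := parse("2026-04-17 08:02:19 +0000 UTC")
        firstPull := parse("2026-04-17 08:02:19.818097797 +0000 UTC")
        lastPull := parse("2026-04-17 08:02:39.165959454 +0000 UTC")
        observed := parse("2026-04-17 08:02:39.72447867 +0000 UTC")

        e2e := observed.Sub(created)         // 20.72447867s  = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // ≈ 1.376617006s = podStartSLOduration
        fmt.Println(e2e, slo)
    }

Entries with firstStartedPulling/lastFinishedPulling of "0001-01-01 00:00:00" (the console pod earlier) simply had no image pull, so the two durations coincide.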
Apr 17 08:02:54.759634 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:54.759594 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerStarted","Data":"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba"}
Apr 17 08:02:54.760081 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:54.759948 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"
Apr 17 08:02:54.761424 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:54.761390 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:02:54.774875 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:54.774830 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podStartSLOduration=0.863951987 podStartE2EDuration="35.774819901s" podCreationTimestamp="2026-04-17 08:02:19 +0000 UTC" firstStartedPulling="2026-04-17 08:02:19.628993781 +0000 UTC m=+599.438004488" lastFinishedPulling="2026-04-17 08:02:54.539861691 +0000 UTC m=+634.348872402" observedRunningTime="2026-04-17 08:02:54.773642233 +0000 UTC m=+634.582652963" watchObservedRunningTime="2026-04-17 08:02:54.774819901 +0000 UTC m=+634.583830631"
Apr 17 08:02:55.763576 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:02:55.763539 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:00.713479 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:00.713432 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 17 08:03:00.713845 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:00.713432 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 17 08:03:03.681851 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:03.681807 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 17 08:03:05.763548 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:05.763504 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:10.713280 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:10.713223 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 17 08:03:10.713837 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:10.713235 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 17 08:03:13.681122 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:13.681075 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 17 08:03:15.763727 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:15.763679 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:20.713521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:20.713471 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 17 08:03:20.713969 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:20.713471 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 17 08:03:23.681595 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:23.681545 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 17 08:03:25.764092 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:25.764050 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:30.713878 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:30.713831 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 17 08:03:30.714422 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:30.713831 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 17 08:03:33.682667 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:33.682630 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"
Apr 17 08:03:35.764542 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:35.764493 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:40.713465 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:40.713425 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 17 08:03:40.713916 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:40.713422 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 17 08:03:45.764011 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:45.763965 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 17 08:03:48.830473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:48.830445 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"
Apr 17 08:03:50.714368 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:50.714336 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"
Apr 17 08:03:53.384985 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.384948 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"]
Apr 17 08:03:53.385412 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.385269 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" containerID="cri-o://2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122" gracePeriod=30
Apr 17 08:03:53.402946 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.402921 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"]
Apr 17 08:03:53.403257 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.403244 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerName="console"
Apr 17 08:03:53.403305 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.403260 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerName="console"
Apr 17 08:03:53.403356 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.403343 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c6e4478-2fb3-4be0-8055-71511aca685f" containerName="console"
Apr 17 08:03:53.406164 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.406150 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"
Apr 17 08:03:53.411964 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.411940 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"]
Apr 17 08:03:53.416663 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.416643 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"
Apr 17 08:03:53.538298 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.538223 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"]
Apr 17 08:03:53.541355 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:03:53.541304 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae50485_7a94_4576_8aa1_6fd93af26606.slice/crio-cf74d6e0f412b619679e714ccfe9f28b9f3e5f8d3f6baa284852b7b3725e8858 WatchSource:0}: Error finding container cf74d6e0f412b619679e714ccfe9f28b9f3e5f8d3f6baa284852b7b3725e8858: Status 404 returned error can't find the container with id cf74d6e0f412b619679e714ccfe9f28b9f3e5f8d3f6baa284852b7b3725e8858
Apr 17 08:03:53.543351 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.543308 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:03:53.681299 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.681266 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 17 08:03:53.933054 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.932965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" event={"ID":"cae50485-7a94-4576-8aa1-6fd93af26606","Type":"ContainerStarted","Data":"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d"}
Apr 17 08:03:53.933054 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.933008 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" event={"ID":"cae50485-7a94-4576-8aa1-6fd93af26606","Type":"ContainerStarted","Data":"cf74d6e0f412b619679e714ccfe9f28b9f3e5f8d3f6baa284852b7b3725e8858"}
Apr 17 08:03:53.933228 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.933178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"
Apr 17 08:03:53.934535 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.934505 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 17 08:03:53.949104 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:53.949058 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podStartSLOduration=0.949044502 podStartE2EDuration="949.044502ms" podCreationTimestamp="2026-04-17 08:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:03:53.94620103 +0000 UTC m=+693.755211761" watchObservedRunningTime="2026-04-17 08:03:53.949044502 +0000 UTC m=+693.758055232"
Apr 17 08:03:54.937230 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:54.937194 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 17 08:03:55.765013 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:55.764978 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"
Apr 17 08:03:56.536481 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.536457 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"
Apr 17 08:03:56.945290 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.945254 2570 generic.go:358] "Generic (PLEG): container finished" podID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerID="2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122" exitCode=0
Apr 17 08:03:56.945463 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.945308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" event={"ID":"c14a4091-fd8e-410c-a9bc-8a3dd336f24f","Type":"ContainerDied","Data":"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"}
Apr 17 08:03:56.945463 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.945339 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"
Apr 17 08:03:56.945463 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.945356 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs" event={"ID":"c14a4091-fd8e-410c-a9bc-8a3dd336f24f","Type":"ContainerDied","Data":"ad5d2575bdca03cbb69804625673ae97ad743a6451f460fa677e310da51d3046"}
Apr 17 08:03:56.945463 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.945371 2570 scope.go:117] "RemoveContainer" containerID="2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"
Apr 17 08:03:56.953330 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.953291 2570 scope.go:117] "RemoveContainer" containerID="2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"
Apr 17 08:03:56.953582 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:03:56.953563 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122\": container with ID starting with 2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122 not found: ID does not exist" containerID="2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"
Apr 17 08:03:56.953657 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.953594 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122"} err="failed to get container status \"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122\": rpc error: code = NotFound desc = could not find container \"2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122\": container with ID starting with 2a3d2df3d9c2da522702be1a6c83d11e83ea9662a4140171cd43e599d16d7122 not found: ID does not exist"
Apr 17 08:03:56.959763 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.959728 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"]
Apr 17 08:03:56.963088 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:56.963064 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-08159-predictor-6c4676d8f5-mkkbs"]
Apr 17 08:03:58.828979 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:03:58.828936 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" path="/var/lib/kubelet/pods/c14a4091-fd8e-410c-a9bc-8a3dd336f24f/volumes"
Apr 17 08:04:04.938155 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:04.938113 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 17 08:04:14.938055 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:14.937962 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 17 08:04:24.938140 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:24.938094 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 08:04:29.149123 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.149081 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:04:29.149571 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.149378 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" containerID="cri-o://6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3" gracePeriod=30 Apr 17 08:04:29.194786 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.194750 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:04:29.195052 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.195028 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" containerID="cri-o://9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b" gracePeriod=30 Apr 17 08:04:29.276880 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.276845 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:04:29.277209 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.277197 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" Apr 17 08:04:29.277254 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.277211 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" Apr 17 08:04:29.277286 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.277264 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c14a4091-fd8e-410c-a9bc-8a3dd336f24f" containerName="kserve-container" Apr 17 08:04:29.281443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.281416 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:04:29.282633 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.282610 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:04:29.282863 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.282843 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" containerID="cri-o://27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba" gracePeriod=30 Apr 17 08:04:29.287908 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.287885 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:04:29.292699 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.292675 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:04:29.425709 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:29.425631 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:04:29.428874 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:04:29.428840 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4189fa80_801e_4640_a9a4_68c294737e2d.slice/crio-3d137a1a2820acc5248690d37afd55b9988bfcef36de535ccea098b27ce8a88e WatchSource:0}: Error finding container 3d137a1a2820acc5248690d37afd55b9988bfcef36de535ccea098b27ce8a88e: Status 404 returned error can't find the container with id 3d137a1a2820acc5248690d37afd55b9988bfcef36de535ccea098b27ce8a88e Apr 17 08:04:30.042688 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.042654 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" event={"ID":"4189fa80-801e-4640-a9a4-68c294737e2d","Type":"ContainerStarted","Data":"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01"} Apr 17 08:04:30.042688 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.042691 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" event={"ID":"4189fa80-801e-4640-a9a4-68c294737e2d","Type":"ContainerStarted","Data":"3d137a1a2820acc5248690d37afd55b9988bfcef36de535ccea098b27ce8a88e"} Apr 17 08:04:30.042888 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.042810 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:04:30.044178 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.044148 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:04:30.057167 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.057101 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podStartSLOduration=1.057078044 
podStartE2EDuration="1.057078044s" podCreationTimestamp="2026-04-17 08:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:30.056695361 +0000 UTC m=+729.865706115" watchObservedRunningTime="2026-04-17 08:04:30.057078044 +0000 UTC m=+729.866088775" Apr 17 08:04:30.714060 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:30.714010 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 17 08:04:31.045498 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:31.045401 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:04:33.235298 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:33.235272 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:04:33.280799 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:33.280760 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location\") pod \"ed0e16d2-e569-49ed-8a3c-387d722dbe93\" (UID: \"ed0e16d2-e569-49ed-8a3c-387d722dbe93\") " Apr 17 08:04:33.281136 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:33.281107 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed0e16d2-e569-49ed-8a3c-387d722dbe93" (UID: "ed0e16d2-e569-49ed-8a3c-387d722dbe93"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:04:33.381471 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:33.381432 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0e16d2-e569-49ed-8a3c-387d722dbe93-kserve-provision-location\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:04:34.055181 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.055140 2570 generic.go:358] "Generic (PLEG): container finished" podID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerID="27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba" exitCode=0 Apr 17 08:04:34.055407 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.055221 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" Apr 17 08:04:34.055407 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.055225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerDied","Data":"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba"} Apr 17 08:04:34.055407 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.055264 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5" event={"ID":"ed0e16d2-e569-49ed-8a3c-387d722dbe93","Type":"ContainerDied","Data":"a2b4dd1dbc8957e49395163df3dd026df477cec0e40d4599125816c6f9e09f13"} Apr 17 08:04:34.055407 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.055286 2570 scope.go:117] "RemoveContainer" containerID="27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba" Apr 17 08:04:34.066468 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.066436 2570 scope.go:117] "RemoveContainer" containerID="690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3" Apr 17 08:04:34.074632 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.074519 2570 scope.go:117] "RemoveContainer" containerID="27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba" Apr 17 08:04:34.074993 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:34.074963 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba\": container with ID starting with 27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba not found: ID does not exist" containerID="27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba" Apr 17 08:04:34.075069 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.075007 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba"} err="failed to get container status \"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba\": rpc error: code = NotFound desc = could not find container \"27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba\": container with ID starting with 27319a78086a02b36594c0dc4b11b3414d4505d94f06aa7a942f56d9b501f3ba not found: ID does not exist" Apr 17 08:04:34.075069 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.075033 2570 scope.go:117] "RemoveContainer" containerID="690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3" Apr 17 08:04:34.075340 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:34.075298 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3\": container with ID starting with 690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3 not found: ID does not exist" containerID="690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3" Apr 17 08:04:34.075396 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.075349 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3"} err="failed to get container status \"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3\": rpc error: code 
= NotFound desc = could not find container \"690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3\": container with ID starting with 690e4548f54fdc18db2de0030319c6826219469a9f1f31666d4c117b81bad5f3 not found: ID does not exist" Apr 17 08:04:34.076043 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.076025 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:04:34.078908 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.078887 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-zwcb5"] Apr 17 08:04:34.829149 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.829116 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" path="/var/lib/kubelet/pods/ed0e16d2-e569-49ed-8a3c-387d722dbe93/volumes" Apr 17 08:04:34.918587 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.918557 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:04:34.921580 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.921556 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:04:34.937352 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.937298 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 17 08:04:34.993192 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.993151 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location\") pod \"9641e509-6813-4e2f-97d4-efc806374a09\" (UID: \"9641e509-6813-4e2f-97d4-efc806374a09\") " Apr 17 08:04:34.993192 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.993200 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location\") pod \"dac5afcb-a1b8-43e1-8ffa-f1579954ac75\" (UID: \"dac5afcb-a1b8-43e1-8ffa-f1579954ac75\") " Apr 17 08:04:34.993539 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.993513 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9641e509-6813-4e2f-97d4-efc806374a09" (UID: "9641e509-6813-4e2f-97d4-efc806374a09"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:04:34.993583 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:34.993559 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dac5afcb-a1b8-43e1-8ffa-f1579954ac75" (UID: "dac5afcb-a1b8-43e1-8ffa-f1579954ac75"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:04:35.060874 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.060835 2570 generic.go:358] "Generic (PLEG): container finished" podID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerID="9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b" exitCode=0 Apr 17 08:04:35.061065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.060915 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" Apr 17 08:04:35.061065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.060929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerDied","Data":"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b"} Apr 17 08:04:35.061065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.060974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz" event={"ID":"dac5afcb-a1b8-43e1-8ffa-f1579954ac75","Type":"ContainerDied","Data":"26e209bc05b0b042a7955043f1939e4a96efbe54122336d29c43225988c8cdbc"} Apr 17 08:04:35.061065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.060994 2570 scope.go:117] "RemoveContainer" containerID="9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b" Apr 17 08:04:35.062671 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.062642 2570 generic.go:358] "Generic (PLEG): container finished" podID="9641e509-6813-4e2f-97d4-efc806374a09" containerID="6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3" exitCode=0 Apr 17 08:04:35.062797 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.062690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerDied","Data":"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3"} Apr 17 08:04:35.062797 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.062710 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" Apr 17 08:04:35.062797 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.062749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn" event={"ID":"9641e509-6813-4e2f-97d4-efc806374a09","Type":"ContainerDied","Data":"c3a1976909fdb58b3ffeca0db608a9832c82fc88287d873b3c4e63c7d4008e1c"} Apr 17 08:04:35.069590 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.069567 2570 scope.go:117] "RemoveContainer" containerID="ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7" Apr 17 08:04:35.077180 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.077157 2570 scope.go:117] "RemoveContainer" containerID="9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b" Apr 17 08:04:35.077465 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:35.077443 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b\": container with ID starting with 9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b not found: ID does not exist" containerID="9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b" Apr 17 08:04:35.077538 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.077473 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b"} err="failed to get container status \"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b\": rpc error: code = NotFound desc = could not find container \"9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b\": container with ID starting with 9fe3a62f8d08a3e4b58cb3bec402622a5996a28efa178a39f7d5bf132958de4b not found: ID does not exist" Apr 17 08:04:35.077538 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.077492 2570 scope.go:117] "RemoveContainer" containerID="ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7" Apr 17 08:04:35.077755 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:35.077736 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7\": container with ID starting with ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7 not found: ID does not exist" containerID="ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7" Apr 17 08:04:35.077815 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.077765 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7"} err="failed to get container status \"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7\": rpc error: code = NotFound desc = could not find container \"ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7\": container with ID starting with ac22ed9ed9cc4e31121ff083286da65189a50d84a770943ae6f0ace9c5b672c7 not found: ID does not exist" Apr 17 08:04:35.077815 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.077787 2570 scope.go:117] "RemoveContainer" containerID="6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3" Apr 17 08:04:35.084188 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.084163 2570 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:04:35.085455 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.085435 2570 scope.go:117] "RemoveContainer" containerID="31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf" Apr 17 08:04:35.087975 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.087954 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-mc4zn"] Apr 17 08:04:35.093428 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.093396 2570 scope.go:117] "RemoveContainer" containerID="6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3" Apr 17 08:04:35.093739 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:35.093718 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3\": container with ID starting with 6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3 not found: ID does not exist" containerID="6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3" Apr 17 08:04:35.093801 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.093751 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3"} err="failed to get container status \"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3\": rpc error: code = NotFound desc = could not find container \"6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3\": container with ID starting with 6fd20bff3e667c96204aa94c89290055f373160ec5c010fbffc15bd6276bb7c3 not found: ID does not exist" Apr 17 08:04:35.093801 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.093771 2570 scope.go:117] "RemoveContainer" containerID="31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf" Apr 17 08:04:35.093896 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.093726 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9641e509-6813-4e2f-97d4-efc806374a09-kserve-provision-location\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:04:35.093896 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.093835 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac5afcb-a1b8-43e1-8ffa-f1579954ac75-kserve-provision-location\") on node \"ip-10-0-138-233.ec2.internal\" DevicePath \"\"" Apr 17 08:04:35.094017 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:04:35.093999 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf\": container with ID starting with 31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf not found: ID does not exist" containerID="31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf" Apr 17 08:04:35.094056 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.094022 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf"} err="failed to get container status \"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf\": rpc error: code = NotFound desc = could not find 
container \"31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf\": container with ID starting with 31449d9bbbffb6239083493998c52fecf413f60faab9fd14bcdd43b0862792cf not found: ID does not exist" Apr 17 08:04:35.097050 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.097030 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:04:35.100783 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:35.100757 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-444sz"] Apr 17 08:04:36.830053 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:36.830022 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9641e509-6813-4e2f-97d4-efc806374a09" path="/var/lib/kubelet/pods/9641e509-6813-4e2f-97d4-efc806374a09/volumes" Apr 17 08:04:36.830450 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:36.830433 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" path="/var/lib/kubelet/pods/dac5afcb-a1b8-43e1-8ffa-f1579954ac75/volumes" Apr 17 08:04:41.045686 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:41.045639 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:04:44.938721 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:44.938684 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" Apr 17 08:04:51.046050 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:04:51.046010 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:05:01.046376 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:05:01.046298 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:05:11.045632 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:05:11.045583 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 17 08:05:21.046501 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:05:21.046409 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:13:18.299693 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.299613 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"] Apr 17 08:13:18.300189 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.299943 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" containerID="cri-o://2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d" gracePeriod=30 Apr 17 08:13:18.354709 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.354671 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"] Apr 17 08:13:18.355072 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355055 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="storage-initializer" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355075 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="storage-initializer" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355096 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="storage-initializer" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355105 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="storage-initializer" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355114 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355123 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355137 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355145 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" Apr 17 08:13:18.355154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355155 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="storage-initializer" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355164 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="storage-initializer" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355174 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355183 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355270 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed0e16d2-e569-49ed-8a3c-387d722dbe93" containerName="kserve-container" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355288 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="dac5afcb-a1b8-43e1-8ffa-f1579954ac75" containerName="kserve-container" Apr 17 08:13:18.355546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.355299 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9641e509-6813-4e2f-97d4-efc806374a09" containerName="kserve-container" Apr 17 08:13:18.358219 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.358196 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" Apr 17 08:13:18.364341 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.364291 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"] Apr 17 08:13:18.369106 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.369083 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" Apr 17 08:13:18.498953 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.498905 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"] Apr 17 08:13:18.500703 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:13:18.500672 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241f9031_3f12_4b32_9353_ccb3e1a055d6.slice/crio-008b4c393df8d3fce9e469fb60c48352e534131139ed60324f4530fbf78365a1 WatchSource:0}: Error finding container 008b4c393df8d3fce9e469fb60c48352e534131139ed60324f4530fbf78365a1: Status 404 returned error can't find the container with id 008b4c393df8d3fce9e469fb60c48352e534131139ed60324f4530fbf78365a1 Apr 17 08:13:18.502819 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.502786 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:13:18.550697 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:18.550626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" event={"ID":"241f9031-3f12-4b32-9353-ccb3e1a055d6","Type":"ContainerStarted","Data":"008b4c393df8d3fce9e469fb60c48352e534131139ed60324f4530fbf78365a1"} Apr 17 08:13:19.555578 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:19.555537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" event={"ID":"241f9031-3f12-4b32-9353-ccb3e1a055d6","Type":"ContainerStarted","Data":"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"} Apr 17 08:13:19.556041 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:19.555799 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" Apr 17 08:13:19.556922 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:19.556891 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:13:19.570775 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:19.570726 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podStartSLOduration=1.570714905 podStartE2EDuration="1.570714905s" 
podCreationTimestamp="2026-04-17 08:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:13:19.56940067 +0000 UTC m=+1259.378411394" watchObservedRunningTime="2026-04-17 08:13:19.570714905 +0000 UTC m=+1259.379725635" Apr 17 08:13:20.559271 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:20.559228 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:13:21.346233 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.346210 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" Apr 17 08:13:21.563368 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.563286 2570 generic.go:358] "Generic (PLEG): container finished" podID="cae50485-7a94-4576-8aa1-6fd93af26606" containerID="2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d" exitCode=0 Apr 17 08:13:21.563747 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.563378 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" event={"ID":"cae50485-7a94-4576-8aa1-6fd93af26606","Type":"ContainerDied","Data":"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d"} Apr 17 08:13:21.563747 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.563389 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" Apr 17 08:13:21.563747 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.563413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx" event={"ID":"cae50485-7a94-4576-8aa1-6fd93af26606","Type":"ContainerDied","Data":"cf74d6e0f412b619679e714ccfe9f28b9f3e5f8d3f6baa284852b7b3725e8858"} Apr 17 08:13:21.563747 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.563428 2570 scope.go:117] "RemoveContainer" containerID="2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d" Apr 17 08:13:21.571528 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.571513 2570 scope.go:117] "RemoveContainer" containerID="2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d" Apr 17 08:13:21.571774 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:13:21.571750 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d\": container with ID starting with 2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d not found: ID does not exist" containerID="2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d" Apr 17 08:13:21.571825 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.571784 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d"} err="failed to get container status \"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d\": rpc error: code = NotFound desc = could not find container \"2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d\": container with ID 
starting with 2cbcd3348271dfd8b2795f725864ab0c96c50a32035ce6b7e93070c5e186353d not found: ID does not exist" Apr 17 08:13:21.582358 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.582334 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"] Apr 17 08:13:21.586654 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:21.586635 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4828d-predictor-5cbf5bc797-n6xzx"] Apr 17 08:13:22.829189 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:22.829145 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" path="/var/lib/kubelet/pods/cae50485-7a94-4576-8aa1-6fd93af26606/volumes" Apr 17 08:13:30.559293 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:30.559229 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:13:40.559986 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:40.559931 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:13:50.559776 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:50.559723 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:13:54.133028 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.132997 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:13:54.135429 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.133240 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" containerID="cri-o://b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01" gracePeriod=30 Apr 17 08:13:54.171611 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.171580 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"] Apr 17 08:13:54.171909 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.171897 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" Apr 17 08:13:54.171952 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.171911 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" Apr 17 08:13:54.172027 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.171966 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cae50485-7a94-4576-8aa1-6fd93af26606" containerName="kserve-container" Apr 17 08:13:54.174962 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.174942 2570 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" Apr 17 08:13:54.180935 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.180910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"] Apr 17 08:13:54.185239 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.185218 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" Apr 17 08:13:54.301947 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.301912 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"] Apr 17 08:13:54.304964 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:13:54.304942 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0180cef1_5b9f_4eed_b1e8_8060286f7ac7.slice/crio-6c1420a6b86c12265951f21a0292292d6a7e990362cb7e8cd18a86c8a2c02881 WatchSource:0}: Error finding container 6c1420a6b86c12265951f21a0292292d6a7e990362cb7e8cd18a86c8a2c02881: Status 404 returned error can't find the container with id 6c1420a6b86c12265951f21a0292292d6a7e990362cb7e8cd18a86c8a2c02881 Apr 17 08:13:54.667018 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.666975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" event={"ID":"0180cef1-5b9f-4eed-b1e8-8060286f7ac7","Type":"ContainerStarted","Data":"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"} Apr 17 08:13:54.667018 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.667022 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" event={"ID":"0180cef1-5b9f-4eed-b1e8-8060286f7ac7","Type":"ContainerStarted","Data":"6c1420a6b86c12265951f21a0292292d6a7e990362cb7e8cd18a86c8a2c02881"} Apr 17 08:13:54.667247 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.667195 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" Apr 17 08:13:54.668533 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.668493 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:13:54.680921 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:54.680882 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podStartSLOduration=0.680869137 podStartE2EDuration="680.869137ms" podCreationTimestamp="2026-04-17 08:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:13:54.679224154 +0000 UTC m=+1294.488234929" watchObservedRunningTime="2026-04-17 08:13:54.680869137 +0000 UTC m=+1294.489879867" Apr 17 08:13:55.671094 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:55.671050 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:13:57.377421 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.377398 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:13:57.677157 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.677129 2570 generic.go:358] "Generic (PLEG): container finished" podID="4189fa80-801e-4640-a9a4-68c294737e2d" containerID="b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01" exitCode=0 Apr 17 08:13:57.677349 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.677189 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" Apr 17 08:13:57.677349 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.677204 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" event={"ID":"4189fa80-801e-4640-a9a4-68c294737e2d","Type":"ContainerDied","Data":"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01"} Apr 17 08:13:57.677349 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.677239 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k" event={"ID":"4189fa80-801e-4640-a9a4-68c294737e2d","Type":"ContainerDied","Data":"3d137a1a2820acc5248690d37afd55b9988bfcef36de535ccea098b27ce8a88e"} Apr 17 08:13:57.677349 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.677254 2570 scope.go:117] "RemoveContainer" containerID="b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01" Apr 17 08:13:57.685599 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.685581 2570 scope.go:117] "RemoveContainer" containerID="b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01" Apr 17 08:13:57.685888 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:13:57.685865 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01\": container with ID starting with b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01 not found: ID does not exist" containerID="b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01" Apr 17 08:13:57.685954 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.685899 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01"} err="failed to get container status \"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01\": rpc error: code = NotFound desc = could not find container \"b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01\": container with ID starting with b9e9fa3bc2dec791100aa333edc6302e68873ff808622b4f0c0a4e1dcf525b01 not found: ID does not exist" Apr 17 08:13:57.698244 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.698216 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:13:57.700578 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:57.700556 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c3bdf-predictor-847869ccdf-76d9k"] Apr 17 08:13:58.828366 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:13:58.828331 2570 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" path="/var/lib/kubelet/pods/4189fa80-801e-4640-a9a4-68c294737e2d/volumes" Apr 17 08:14:00.560159 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:00.560104 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 17 08:14:05.671593 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:05.671553 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:14:10.560428 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:10.560391 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" Apr 17 08:14:15.671403 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:15.671361 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:14:25.672075 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:25.672032 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:14:35.671363 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:35.671298 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 17 08:14:38.632330 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.632284 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"] Apr 17 08:14:38.632772 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.632637 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" Apr 17 08:14:38.632772 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.632651 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" Apr 17 08:14:38.632772 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.632703 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4189fa80-801e-4640-a9a4-68c294737e2d" containerName="kserve-container" Apr 17 08:14:38.636841 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.636813 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"
Apr 17 08:14:38.644403 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.644378 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"]
Apr 17 08:14:38.647770 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.647747 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"
Apr 17 08:14:38.655294 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.655270 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"]
Apr 17 08:14:38.655591 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.655560 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" containerID="cri-o://2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021" gracePeriod=30
Apr 17 08:14:38.768194 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.768167 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"]
Apr 17 08:14:38.770687 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:14:38.770660 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe9f135_f6a5_41f7_80e1_669507d408ce.slice/crio-6f16244c6f95be91d24c0c93c0ad4af5fcebc153fb4baeaa14d96343e31ee86a WatchSource:0}: Error finding container 6f16244c6f95be91d24c0c93c0ad4af5fcebc153fb4baeaa14d96343e31ee86a: Status 404 returned error can't find the container with id 6f16244c6f95be91d24c0c93c0ad4af5fcebc153fb4baeaa14d96343e31ee86a
Apr 17 08:14:38.802101 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:38.802073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" event={"ID":"1fe9f135-f6a5-41f7-80e1-669507d408ce","Type":"ContainerStarted","Data":"6f16244c6f95be91d24c0c93c0ad4af5fcebc153fb4baeaa14d96343e31ee86a"}
Apr 17 08:14:39.806085 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:39.806045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" event={"ID":"1fe9f135-f6a5-41f7-80e1-669507d408ce","Type":"ContainerStarted","Data":"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c"}
Apr 17 08:14:39.806508 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:39.806343 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"
Apr 17 08:14:39.807544 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:39.807516 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:14:39.820503 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:39.820461 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podStartSLOduration=1.820448967 podStartE2EDuration="1.820448967s" podCreationTimestamp="2026-04-17 08:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:14:39.818781885 +0000 UTC m=+1339.627792616" watchObservedRunningTime="2026-04-17 08:14:39.820448967 +0000 UTC m=+1339.629459696"
Apr 17 08:14:40.559726 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:40.559676 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 17 08:14:40.809426 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:40.809387 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:14:41.700808 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.700785 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"
Apr 17 08:14:41.813014 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.812932 2570 generic.go:358] "Generic (PLEG): container finished" podID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerID="2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021" exitCode=0
Apr 17 08:14:41.813014 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.812989 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"
Apr 17 08:14:41.813470 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.813016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" event={"ID":"241f9031-3f12-4b32-9353-ccb3e1a055d6","Type":"ContainerDied","Data":"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"}
Apr 17 08:14:41.813470 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.813053 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh" event={"ID":"241f9031-3f12-4b32-9353-ccb3e1a055d6","Type":"ContainerDied","Data":"008b4c393df8d3fce9e469fb60c48352e534131139ed60324f4530fbf78365a1"}
Apr 17 08:14:41.813470 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.813070 2570 scope.go:117] "RemoveContainer" containerID="2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"
Apr 17 08:14:41.821241 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.821222 2570 scope.go:117] "RemoveContainer" containerID="2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"
Apr 17 08:14:41.821493 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:14:41.821465 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021\": container with ID starting with 2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021 not found: ID does not exist" containerID="2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"
Apr 17 08:14:41.821620 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.821494 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021"} err="failed to get container status \"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021\": rpc error: code = NotFound desc = could not find container \"2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021\": container with ID starting with 2eeebb8ac110532eedfe1841ad773bf9b2d2ea3acdf62b2c1539f7ea64a9a021 not found: ID does not exist"
Apr 17 08:14:41.833130 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.833106 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"]
Apr 17 08:14:41.838785 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:41.838764 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390bf-predictor-bd86b758f-cd2fh"]
Apr 17 08:14:42.828266 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:42.828232 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" path="/var/lib/kubelet/pods/241f9031-3f12-4b32-9353-ccb3e1a055d6/volumes"
Apr 17 08:14:45.672587 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:45.672556 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"
Apr 17 08:14:50.809814 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:14:50.809768 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:15:00.809766 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:00.809711 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:15:10.809853 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:10.809809 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:15:14.294800 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.294759 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"]
Apr 17 08:15:14.295280 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.295037 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" containerID="cri-o://c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6" gracePeriod=30
Apr 17 08:15:14.476049 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.476016 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"]
Apr 17 08:15:14.476384 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.476371 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container"
Apr 17 08:15:14.476384 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.476385 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container"
Apr 17 08:15:14.476475 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.476440 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="241f9031-3f12-4b32-9353-ccb3e1a055d6" containerName="kserve-container"
Apr 17 08:15:14.479254 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.479232 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"
Apr 17 08:15:14.488573 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.488546 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"]
Apr 17 08:15:14.489652 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.489635 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"
Apr 17 08:15:14.610590 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.610564 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"]
Apr 17 08:15:14.612848 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:15:14.612820 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a08f91_43d7_44f2_ac02_5f1951a3fb98.slice/crio-22dfe8ffd88e796563ea50def604b0510f7a67694777b8d30dd84a49cb8d9bf5 WatchSource:0}: Error finding container 22dfe8ffd88e796563ea50def604b0510f7a67694777b8d30dd84a49cb8d9bf5: Status 404 returned error can't find the container with id 22dfe8ffd88e796563ea50def604b0510f7a67694777b8d30dd84a49cb8d9bf5
Apr 17 08:15:14.902336 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.902286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" event={"ID":"51a08f91-43d7-44f2-ac02-5f1951a3fb98","Type":"ContainerStarted","Data":"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6"}
Apr 17 08:15:14.902336 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.902334 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" event={"ID":"51a08f91-43d7-44f2-ac02-5f1951a3fb98","Type":"ContainerStarted","Data":"22dfe8ffd88e796563ea50def604b0510f7a67694777b8d30dd84a49cb8d9bf5"}
Apr 17 08:15:14.902572 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.902487 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"
Apr 17 08:15:14.903768 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.903746 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 17 08:15:14.914970 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:14.914923 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podStartSLOduration=0.914906272 podStartE2EDuration="914.906272ms" podCreationTimestamp="2026-04-17 08:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:15:14.914422844 +0000 UTC m=+1374.723433584" watchObservedRunningTime="2026-04-17 08:15:14.914906272 +0000 UTC m=+1374.723917003"
Apr 17 08:15:15.671557 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:15.671517 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 17 08:15:15.905367 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:15.905328 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 17 08:15:17.439431 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.439408 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"
Apr 17 08:15:17.911094 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.911062 2570 generic.go:358] "Generic (PLEG): container finished" podID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerID="c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6" exitCode=0
Apr 17 08:15:17.911264 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.911119 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"
Apr 17 08:15:17.911264 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.911153 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" event={"ID":"0180cef1-5b9f-4eed-b1e8-8060286f7ac7","Type":"ContainerDied","Data":"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"}
Apr 17 08:15:17.911264 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.911198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd" event={"ID":"0180cef1-5b9f-4eed-b1e8-8060286f7ac7","Type":"ContainerDied","Data":"6c1420a6b86c12265951f21a0292292d6a7e990362cb7e8cd18a86c8a2c02881"}
Apr 17 08:15:17.911264 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.911218 2570 scope.go:117] "RemoveContainer" containerID="c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"
Apr 17 08:15:17.921691 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.921673 2570 scope.go:117] "RemoveContainer" containerID="c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"
Apr 17 08:15:17.921963 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:15:17.921939 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6\": container with ID starting with c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6 not found: ID does not exist" containerID="c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"
Apr 17 08:15:17.922054 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.921964 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6"} err="failed to get container status \"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6\": rpc error: code = NotFound desc = could not find container \"c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6\": container with ID starting with c3273a8ffdd66819984104832c0a7d93916aa235ff031b964cb3c9ad911c65d6 not found: ID does not exist"
Apr 17 08:15:17.931298 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.931275 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"]
Apr 17 08:15:17.934361 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:17.934338 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-375fe-predictor-85794dd7c-zw8vd"]
Apr 17 08:15:18.828890 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:18.828855 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" path="/var/lib/kubelet/pods/0180cef1-5b9f-4eed-b1e8-8060286f7ac7/volumes"
Apr 17 08:15:20.810232 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:20.810200 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 08:15:25.905714 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:25.905663 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 08:15:30.811500 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:30.811469 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" Apr 17 08:15:35.906452 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:35.906403 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 08:15:45.905544 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:45.905502 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 08:15:55.905640 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:15:55.905577 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 08:16:05.906515 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:16:05.906477 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" Apr 17 08:24:03.503059 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.502971 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"] Apr 17 08:24:03.504006 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.503232 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" containerID="cri-o://831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c" gracePeriod=30 Apr 17 08:24:03.544065 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.544030 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"] Apr 17 08:24:03.544377 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.544362 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" Apr 17 08:24:03.544434 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.544378 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" Apr 17 08:24:03.544469 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.544444 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0180cef1-5b9f-4eed-b1e8-8060286f7ac7" containerName="kserve-container" Apr 17 08:24:03.547244 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.547225 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" Apr 17 08:24:03.556854 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.556830 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"] Apr 17 08:24:03.556970 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.556953 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" Apr 17 08:24:03.678362 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.678243 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"] Apr 17 08:24:03.681024 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:24:03.680988 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb684391_6aeb_41a8_a504_268ba4376434.slice/crio-29de89446782b117978397bbf352f8f87c019cf4ac4efb6545146a20284bb227 WatchSource:0}: Error finding container 29de89446782b117978397bbf352f8f87c019cf4ac4efb6545146a20284bb227: Status 404 returned error can't find the container with id 29de89446782b117978397bbf352f8f87c019cf4ac4efb6545146a20284bb227 Apr 17 08:24:03.683032 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:03.683013 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:24:04.396620 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:04.396588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" event={"ID":"cb684391-6aeb-41a8-a504-268ba4376434","Type":"ContainerStarted","Data":"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"} Apr 17 08:24:04.396620 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:04.396621 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" event={"ID":"cb684391-6aeb-41a8-a504-268ba4376434","Type":"ContainerStarted","Data":"29de89446782b117978397bbf352f8f87c019cf4ac4efb6545146a20284bb227"} Apr 17 08:24:04.396827 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:04.396790 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" Apr 17 08:24:04.398064 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:04.398038 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 08:24:04.411520 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:04.411484 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podStartSLOduration=1.411472499 podStartE2EDuration="1.411472499s" podCreationTimestamp="2026-04-17 08:24:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:24:04.4097053 +0000 UTC m=+1904.218716031" watchObservedRunningTime="2026-04-17 08:24:04.411472499 +0000 UTC m=+1904.220483228" Apr 17 08:24:05.399166 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:05.399131 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 08:24:06.541291 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:06.541268 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" Apr 17 08:24:07.404982 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.404951 2570 generic.go:358] "Generic (PLEG): container finished" podID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerID="831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c" exitCode=0 Apr 17 08:24:07.405170 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.405013 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" Apr 17 08:24:07.405170 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.405030 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" event={"ID":"1fe9f135-f6a5-41f7-80e1-669507d408ce","Type":"ContainerDied","Data":"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c"} Apr 17 08:24:07.405170 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.405087 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht" event={"ID":"1fe9f135-f6a5-41f7-80e1-669507d408ce","Type":"ContainerDied","Data":"6f16244c6f95be91d24c0c93c0ad4af5fcebc153fb4baeaa14d96343e31ee86a"} Apr 17 08:24:07.405170 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.405105 2570 scope.go:117] "RemoveContainer" containerID="831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c" Apr 17 08:24:07.415262 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.415240 2570 scope.go:117] "RemoveContainer" containerID="831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c" Apr 17 08:24:07.415739 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:24:07.415706 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c\": container with ID starting with 831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c not found: ID does not exist" containerID="831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c" Apr 17 08:24:07.415840 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.415751 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c"} err="failed to get container status \"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c\": rpc error: code = NotFound desc = could not find container \"831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c\": container with ID starting with 831bbe64e2d8100e30357e2da09db58b01d0c0836db321504fe814a3092ecd5c not found: ID does not exist" Apr 17 08:24:07.420473 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.420450 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"] Apr 17 08:24:07.423833 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:07.423813 2570 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65d2e-predictor-8468db4849-mbmht"] Apr 17 08:24:08.829074 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:08.829037 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" path="/var/lib/kubelet/pods/1fe9f135-f6a5-41f7-80e1-669507d408ce/volumes" Apr 17 08:24:15.400153 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:15.400107 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 08:24:25.399489 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:25.399440 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 08:24:35.399507 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:35.399467 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 08:24:39.236647 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.236614 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"] Apr 17 08:24:39.237036 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.236877 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container" containerID="cri-o://137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6" gracePeriod=30 Apr 17 08:24:39.384183 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.384145 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:24:39.384541 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.384527 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" Apr 17 08:24:39.384541 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.384542 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" Apr 17 08:24:39.384625 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.384596 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fe9f135-f6a5-41f7-80e1-669507d408ce" containerName="kserve-container" Apr 17 08:24:39.388670 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.388651 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" Apr 17 08:24:39.397379 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.397352 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:24:39.399755 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.399730 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" Apr 17 08:24:39.522710 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:39.522603 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:24:39.526491 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:24:39.526462 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9868b43_3967_4ab1_aa78_76b17d95df70.slice/crio-5e5c76c6decdd3df53427d7c67f714fd7134c7cd72f615fb3fa8f2f4b9c77047 WatchSource:0}: Error finding container 5e5c76c6decdd3df53427d7c67f714fd7134c7cd72f615fb3fa8f2f4b9c77047: Status 404 returned error can't find the container with id 5e5c76c6decdd3df53427d7c67f714fd7134c7cd72f615fb3fa8f2f4b9c77047 Apr 17 08:24:40.501909 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:40.501874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" event={"ID":"b9868b43-3967-4ab1-aa78-76b17d95df70","Type":"ContainerStarted","Data":"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20"} Apr 17 08:24:40.501909 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:40.501909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" event={"ID":"b9868b43-3967-4ab1-aa78-76b17d95df70","Type":"ContainerStarted","Data":"5e5c76c6decdd3df53427d7c67f714fd7134c7cd72f615fb3fa8f2f4b9c77047"} Apr 17 08:24:40.502449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:40.502057 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" Apr 17 08:24:40.503179 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:40.503153 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 08:24:40.517619 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:40.517574 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podStartSLOduration=1.517560323 podStartE2EDuration="1.517560323s" podCreationTimestamp="2026-04-17 08:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:24:40.516421795 +0000 UTC m=+1940.325432536" watchObservedRunningTime="2026-04-17 08:24:40.517560323 +0000 UTC m=+1940.326571053" Apr 17 08:24:41.505480 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:41.505441 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 08:24:42.478750 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.478728 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" Apr 17 08:24:42.509351 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.509262 2570 generic.go:358] "Generic (PLEG): container finished" podID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerID="137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6" exitCode=0 Apr 17 08:24:42.509736 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.509346 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" Apr 17 08:24:42.509736 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.509348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" event={"ID":"51a08f91-43d7-44f2-ac02-5f1951a3fb98","Type":"ContainerDied","Data":"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6"} Apr 17 08:24:42.509736 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.509465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92" event={"ID":"51a08f91-43d7-44f2-ac02-5f1951a3fb98","Type":"ContainerDied","Data":"22dfe8ffd88e796563ea50def604b0510f7a67694777b8d30dd84a49cb8d9bf5"} Apr 17 08:24:42.509736 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.509479 2570 scope.go:117] "RemoveContainer" containerID="137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6" Apr 17 08:24:42.518248 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.518229 2570 scope.go:117] "RemoveContainer" containerID="137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6" Apr 17 08:24:42.518514 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:24:42.518493 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6\": container with ID starting with 137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6 not found: ID does not exist" containerID="137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6" Apr 17 08:24:42.518585 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.518525 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6"} err="failed to get container status \"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6\": rpc error: code = NotFound desc = could not find container \"137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6\": container with ID starting with 137ddc84e1c188a6b9c3e2313011c071a69078209d8447ec6cd2f7ef6ad62cb6 not found: ID does not exist" Apr 17 08:24:42.529789 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.529768 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"] Apr 17 08:24:42.534607 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.534585 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60721-predictor-85469d7b4f-mhw92"] Apr 17 08:24:42.829268 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:42.829195 2570 
Apr 17 08:24:45.399416 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:45.399373 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 17 08:24:51.505837 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:51.505790 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 17 08:24:55.401133 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:24:55.401100 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"
Apr 17 08:25:01.505772 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:01.505726 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 17 08:25:11.506389 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:11.506284 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 17 08:25:21.506468 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:21.506428 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 17 08:25:23.709745 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.709707 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"]
Apr 17 08:25:23.710118 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.709948 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" containerID="cri-o://45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f" gracePeriod=30
Apr 17 08:25:23.845558 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.845524 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:25:23.845877 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.845864 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container"
Apr 17 08:25:23.845933 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.845880 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container"
Apr 17 08:25:23.845967 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.845935 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="51a08f91-43d7-44f2-ac02-5f1951a3fb98" containerName="kserve-container"
Apr 17 08:25:23.849029 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.849008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:25:23.856979 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.856496 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:25:23.860596 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.860571 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:25:23.985691 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:23.985668 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:25:23.988226 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:25:23.988198 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4de8432_25db_4204_8330_4c9c151e2824.slice/crio-52735d0273995d63f840ff990175cc9a8e84ca7543a9cfa8c6d4b34c1f1d7a43 WatchSource:0}: Error finding container 52735d0273995d63f840ff990175cc9a8e84ca7543a9cfa8c6d4b34c1f1d7a43: Status 404 returned error can't find the container with id 52735d0273995d63f840ff990175cc9a8e84ca7543a9cfa8c6d4b34c1f1d7a43
Apr 17 08:25:24.632023 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:24.631983 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" event={"ID":"d4de8432-25db-4204-8330-4c9c151e2824","Type":"ContainerStarted","Data":"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"}
Apr 17 08:25:24.632023 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:24.632020 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" event={"ID":"d4de8432-25db-4204-8330-4c9c151e2824","Type":"ContainerStarted","Data":"52735d0273995d63f840ff990175cc9a8e84ca7543a9cfa8c6d4b34c1f1d7a43"}
Apr 17 08:25:24.632301 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:24.632192 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:25:24.633539 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:24.633517 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:25:24.645298 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:24.645247 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podStartSLOduration=1.645231818 podStartE2EDuration="1.645231818s" podCreationTimestamp="2026-04-17 08:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:25:24.644691107 +0000 UTC m=+1984.453701849" watchObservedRunningTime="2026-04-17 08:25:24.645231818 +0000 UTC m=+1984.454242551"
Apr 17 08:25:25.399426 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:25.399387 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 17 08:25:25.634480 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:25.634445 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:25:27.147909 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.147887 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"
Apr 17 08:25:27.640251 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.640215 2570 generic.go:358] "Generic (PLEG): container finished" podID="cb684391-6aeb-41a8-a504-268ba4376434" containerID="45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f" exitCode=0
Apr 17 08:25:27.640449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.640277 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"
Apr 17 08:25:27.640449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.640299 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" event={"ID":"cb684391-6aeb-41a8-a504-268ba4376434","Type":"ContainerDied","Data":"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"}
Apr 17 08:25:27.640449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.640363 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf" event={"ID":"cb684391-6aeb-41a8-a504-268ba4376434","Type":"ContainerDied","Data":"29de89446782b117978397bbf352f8f87c019cf4ac4efb6545146a20284bb227"}
Apr 17 08:25:27.640449 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.640384 2570 scope.go:117] "RemoveContainer" containerID="45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"
Apr 17 08:25:27.648234 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.648218 2570 scope.go:117] "RemoveContainer" containerID="45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"
Apr 17 08:25:27.648493 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:25:27.648454 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f\": container with ID starting with 45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f not found: ID does not exist" containerID="45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"
Apr 17 08:25:27.648561 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.648499 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f"} err="failed to get container status \"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f\": rpc error: code = NotFound desc = could not find container \"45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f\": container with ID starting with 45a09a955c0d1808401859763524a25a7b366d34f29029dd2bd4041e03021f8f not found: ID does not exist"
Apr 17 08:25:27.658587 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.658560 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"]
Apr 17 08:25:27.662229 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:27.662205 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-00a8c-predictor-9b9db876f-7vskf"]
Apr 17 08:25:28.828810 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:28.828781 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb684391-6aeb-41a8-a504-268ba4376434" path="/var/lib/kubelet/pods/cb684391-6aeb-41a8-a504-268ba4376434/volumes"
Apr 17 08:25:31.506174 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:31.506143 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"
Apr 17 08:25:35.635335 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:35.635273 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:25:45.634902 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:45.634854 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:25:55.634588 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:25:55.634543 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:26:05.635453 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:26:05.635408 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 08:26:15.636309 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:26:15.636268 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:34:48.676962 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:48.676929 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:34:48.679381 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:48.677181 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" containerID="cri-o://bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d" gracePeriod=30
Apr 17 08:34:51.817854 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:51.817831 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:34:52.239926 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.239893 2570 generic.go:358] "Generic (PLEG): container finished" podID="d4de8432-25db-4204-8330-4c9c151e2824" containerID="bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d" exitCode=0
Apr 17 08:34:52.240096 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.239960 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"
Apr 17 08:34:52.240096 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.239965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" event={"ID":"d4de8432-25db-4204-8330-4c9c151e2824","Type":"ContainerDied","Data":"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"}
Apr 17 08:34:52.240096 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.239997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft" event={"ID":"d4de8432-25db-4204-8330-4c9c151e2824","Type":"ContainerDied","Data":"52735d0273995d63f840ff990175cc9a8e84ca7543a9cfa8c6d4b34c1f1d7a43"}
Apr 17 08:34:52.240096 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.240017 2570 scope.go:117] "RemoveContainer" containerID="bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"
Apr 17 08:34:52.247885 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.247864 2570 scope.go:117] "RemoveContainer" containerID="bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"
Apr 17 08:34:52.248217 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:34:52.248194 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d\": container with ID starting with bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d not found: ID does not exist" containerID="bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"
Apr 17 08:34:52.248376 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.248223 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d"} err="failed to get container status \"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d\": rpc error: code = NotFound desc = could not find container \"bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d\": container with ID starting with bb5fed1bdd17ce341f4bdee9aa4dc312d16414da13d1250127d909783860429d not found: ID does not exist"
Apr 17 08:34:52.259555 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.259533 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:34:52.262661 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.262639 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32e35-predictor-bc6967c69-44xft"]
Apr 17 08:34:52.828673 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:34:52.828628 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4de8432-25db-4204-8330-4c9c151e2824" path="/var/lib/kubelet/pods/d4de8432-25db-4204-8330-4c9c151e2824/volumes"
path="/var/lib/kubelet/pods/d4de8432-25db-4204-8330-4c9c151e2824/volumes" Apr 17 08:42:08.879741 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:08.879704 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:42:08.882102 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:08.879938 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" containerID="cri-o://515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20" gracePeriod=30 Apr 17 08:42:11.506532 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:11.506495 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 08:42:11.621903 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:11.621880 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" Apr 17 08:42:12.474564 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.474528 2570 generic.go:358] "Generic (PLEG): container finished" podID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerID="515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20" exitCode=0 Apr 17 08:42:12.474729 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.474588 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" Apr 17 08:42:12.474729 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.474610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" event={"ID":"b9868b43-3967-4ab1-aa78-76b17d95df70","Type":"ContainerDied","Data":"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20"} Apr 17 08:42:12.474729 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.474654 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77" event={"ID":"b9868b43-3967-4ab1-aa78-76b17d95df70","Type":"ContainerDied","Data":"5e5c76c6decdd3df53427d7c67f714fd7134c7cd72f615fb3fa8f2f4b9c77047"} Apr 17 08:42:12.474729 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.474669 2570 scope.go:117] "RemoveContainer" containerID="515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20" Apr 17 08:42:12.482501 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.482482 2570 scope.go:117] "RemoveContainer" containerID="515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20" Apr 17 08:42:12.482754 ip-10-0-138-233 kubenswrapper[2570]: E0417 08:42:12.482738 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20\": container with ID starting with 515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20 not found: ID does not exist" containerID="515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20" Apr 17 08:42:12.482807 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.482762 2570 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20"} err="failed to get container status \"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20\": rpc error: code = NotFound desc = could not find container \"515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20\": container with ID starting with 515ca05172bb24208e8945a300a00b303b972cc9705fc06056cbf68d9ac4bc20 not found: ID does not exist" Apr 17 08:42:12.493170 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.493150 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:42:12.497011 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.496992 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-54cf7-predictor-7bbd5dfff9-qxt77"] Apr 17 08:42:12.829202 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:12.829116 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" path="/var/lib/kubelet/pods/b9868b43-3967-4ab1-aa78-76b17d95df70/volumes" Apr 17 08:42:34.801979 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.801939 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gxzkb/must-gather-245r4"] Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802254 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802265 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802283 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802290 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802297 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802303 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802370 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb684391-6aeb-41a8-a504-268ba4376434" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802377 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4de8432-25db-4204-8330-4c9c151e2824" containerName="kserve-container" Apr 17 08:42:34.802443 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.802385 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9868b43-3967-4ab1-aa78-76b17d95df70" containerName="kserve-container" Apr 17 08:42:34.805278 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.805261 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.807722 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.807691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"kube-root-ca.crt\"" Apr 17 08:42:34.807722 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.807713 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gxzkb\"/\"openshift-service-ca.crt\"" Apr 17 08:42:34.808347 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.808309 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gxzkb\"/\"default-dockercfg-pdhdf\"" Apr 17 08:42:34.812955 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.812933 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/must-gather-245r4"] Apr 17 08:42:34.847638 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.847604 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7nt\" (UniqueName: \"kubernetes.io/projected/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-kube-api-access-fd7nt\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.847821 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.847653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-must-gather-output\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.948961 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.948923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7nt\" (UniqueName: \"kubernetes.io/projected/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-kube-api-access-fd7nt\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.949156 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.948973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-must-gather-output\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.949432 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.949413 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-must-gather-output\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:34.957188 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:34.957165 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7nt\" (UniqueName: \"kubernetes.io/projected/3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be-kube-api-access-fd7nt\") pod \"must-gather-245r4\" (UID: \"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be\") " pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:35.114381 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:35.114274 2570 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/must-gather-245r4" Apr 17 08:42:35.235495 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:35.235463 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/must-gather-245r4"] Apr 17 08:42:35.238340 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:42:35.238298 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e1097c4_7ba0_4f2f_9f12_1c2570eaf3be.slice/crio-1304406726cc579d3733b9f842384dd422455a51a59a45429f9726a81038c599 WatchSource:0}: Error finding container 1304406726cc579d3733b9f842384dd422455a51a59a45429f9726a81038c599: Status 404 returned error can't find the container with id 1304406726cc579d3733b9f842384dd422455a51a59a45429f9726a81038c599 Apr 17 08:42:35.240161 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:35.240145 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:42:35.538201 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:35.538161 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/must-gather-245r4" event={"ID":"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be","Type":"ContainerStarted","Data":"1304406726cc579d3733b9f842384dd422455a51a59a45429f9726a81038c599"} Apr 17 08:42:36.543437 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:36.543403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/must-gather-245r4" event={"ID":"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be","Type":"ContainerStarted","Data":"5ed24dc6cd98dc86876c5fd51a3793dc00685da020f08ed7b20b12bbe244b95a"} Apr 17 08:42:36.543437 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:36.543444 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/must-gather-245r4" event={"ID":"3e1097c4-7ba0-4f2f-9f12-1c2570eaf3be","Type":"ContainerStarted","Data":"c82fe1637690d1d4ff8db030e7f9d2937eb36c7cb118fc508362c1b94f4289e4"} Apr 17 08:42:37.489120 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:37.489092 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x9bnq_746ed911-6342-422a-910a-f742c65c2879/global-pull-secret-syncer/0.log" Apr 17 08:42:37.540252 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:37.540223 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4z55l_34e79f54-de56-4c95-814b-f41296cabe3d/konnectivity-agent/0.log" Apr 17 08:42:37.676302 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:37.676269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-233.ec2.internal_169311e9e20ae28d37e18c3438cd6482/haproxy/0.log" Apr 17 08:42:41.151046 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:41.151004 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8cf54c56c-5tfr4_9e93a016-654f-4003-b4eb-45f420b5b7ec/metrics-server/0.log" Apr 17 08:42:41.185850 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:41.185806 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8d8lp_ab3c33d3-4737-4053-8863-52fa1498a50d/monitoring-plugin/0.log" Apr 17 08:42:41.222893 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:41.222858 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-2bbh6_c534ab59-8b06-4344-a4a6-61cdbdfe5340/node-exporter/0.log" Apr 17 08:42:41.244381 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:41.244353 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bbh6_c534ab59-8b06-4344-a4a6-61cdbdfe5340/kube-rbac-proxy/0.log" Apr 17 08:42:41.263722 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:41.263696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bbh6_c534ab59-8b06-4344-a4a6-61cdbdfe5340/init-textfile/0.log" Apr 17 08:42:42.250703 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.250660 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/thanos-query/0.log" Apr 17 08:42:42.288838 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.288804 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/kube-rbac-proxy-web/0.log" Apr 17 08:42:42.327352 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.327306 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/kube-rbac-proxy/0.log" Apr 17 08:42:42.359664 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.359639 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/prom-label-proxy/0.log" Apr 17 08:42:42.394129 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.394100 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/kube-rbac-proxy-rules/0.log" Apr 17 08:42:42.445708 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:42.445679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccbbc7cfd-7pkwd_56926dfe-e5dc-4b01-abb6-98d9a1516d23/kube-rbac-proxy-metrics/0.log" Apr 17 08:42:44.122644 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.122611 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f67f789-dvtdm_c35a3b2d-f227-49e1-88e9-12fd203c04d8/console/0.log" Apr 17 08:42:44.550090 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.550024 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dz2t5_b20c27f2-f742-43ff-a651-7ac3dbcd83d8/volume-data-source-validator/0.log" Apr 17 08:42:44.556131 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.556081 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gxzkb/must-gather-245r4" podStartSLOduration=9.780510457 podStartE2EDuration="10.556065848s" podCreationTimestamp="2026-04-17 08:42:34 +0000 UTC" firstStartedPulling="2026-04-17 08:42:35.2402684 +0000 UTC m=+3015.049279108" lastFinishedPulling="2026-04-17 08:42:36.015823774 +0000 UTC m=+3015.824834499" observedRunningTime="2026-04-17 08:42:36.558155957 +0000 UTC m=+3016.367166682" watchObservedRunningTime="2026-04-17 08:42:44.556065848 +0000 UTC m=+3024.365076581" Apr 17 08:42:44.556516 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.556500 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q"] Apr 17 08:42:44.560860 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.560836 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.570555 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.570531 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q"] Apr 17 08:42:44.627521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.627214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-proc\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.627521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.627287 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-podres\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.627521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.627358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-sys\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.627521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.627385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-lib-modules\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.627521 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.627436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q22t\" (UniqueName: \"kubernetes.io/projected/727793c9-51a3-4aee-91af-387baf334d1f-kube-api-access-9q22t\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-sys\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-lib-modules\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" 
Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q22t\" (UniqueName: \"kubernetes.io/projected/727793c9-51a3-4aee-91af-387baf334d1f-kube-api-access-9q22t\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-proc\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-podres\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728910 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-podres\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.728976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-sys\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.729061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-lib-modules\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.729490 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.729446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/727793c9-51a3-4aee-91af-387baf334d1f-proc\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.736402 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.736370 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q22t\" (UniqueName: \"kubernetes.io/projected/727793c9-51a3-4aee-91af-387baf334d1f-kube-api-access-9q22t\") pod \"perf-node-gather-daemonset-84x2q\" (UID: \"727793c9-51a3-4aee-91af-387baf334d1f\") " pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:44.871906 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:44.871871 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:45.012680 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.012650 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q"] Apr 17 08:42:45.016778 ip-10-0-138-233 kubenswrapper[2570]: W0417 08:42:45.016746 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod727793c9_51a3_4aee_91af_387baf334d1f.slice/crio-a8efa892b4daf5314cecd4bc8c748c7a6de15a578566093b73c881d6566255c6 WatchSource:0}: Error finding container a8efa892b4daf5314cecd4bc8c748c7a6de15a578566093b73c881d6566255c6: Status 404 returned error can't find the container with id a8efa892b4daf5314cecd4bc8c748c7a6de15a578566093b73c881d6566255c6 Apr 17 08:42:45.277138 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.277076 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vrvr9_6e47d03e-fde2-4cd8-96cd-ec060cb7afb1/dns/0.log" Apr 17 08:42:45.295704 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.295668 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vrvr9_6e47d03e-fde2-4cd8-96cd-ec060cb7afb1/kube-rbac-proxy/0.log" Apr 17 08:42:45.315744 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.315717 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5v2tv_236582ca-8db3-417e-b2cf-dc0053b8afcf/dns-node-resolver/0.log" Apr 17 08:42:45.577107 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.577031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" event={"ID":"727793c9-51a3-4aee-91af-387baf334d1f","Type":"ContainerStarted","Data":"a461e8cf283b5d5cb84a36bfd5ea6f311fa3b90091335e9700a413a8416d6613"} Apr 17 08:42:45.577107 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.577072 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" event={"ID":"727793c9-51a3-4aee-91af-387baf334d1f","Type":"ContainerStarted","Data":"a8efa892b4daf5314cecd4bc8c748c7a6de15a578566093b73c881d6566255c6"} Apr 17 08:42:45.577278 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.577152 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:45.593546 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.593499 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" podStartSLOduration=1.593484369 podStartE2EDuration="1.593484369s" podCreationTimestamp="2026-04-17 08:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:42:45.591417337 +0000 UTC m=+3025.400428065" watchObservedRunningTime="2026-04-17 08:42:45.593484369 +0000 UTC m=+3025.402495098" Apr 17 08:42:45.760831 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:45.760803 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4tp7h_c6a74de4-3b80-4092-8de3-2a795216aa48/node-ca/0.log" Apr 17 08:42:46.824249 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:46.824221 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-qnv6r_77b6c827-22ca-45dc-8ce4-0267c31539a0/serve-healthcheck-canary/0.log" Apr 17 08:42:47.318623 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:47.318591 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n7vn_bbf30a91-db8c-4bc8-a486-9fa0c9c9b449/kube-rbac-proxy/0.log" Apr 17 08:42:47.337325 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:47.337285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n7vn_bbf30a91-db8c-4bc8-a486-9fa0c9c9b449/exporter/0.log" Apr 17 08:42:47.356172 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:47.356145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7n7vn_bbf30a91-db8c-4bc8-a486-9fa0c9c9b449/extractor/0.log" Apr 17 08:42:49.197833 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:49.197782 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-558564fd68-d94vb_5599dd9d-d9a4-43d4-97d4-ebc58704ae98/manager/0.log" Apr 17 08:42:49.503087 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:49.503016 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6w5g2_ee5a709a-57ed-4987-8fbb-f46c9e961e33/manager/0.log" Apr 17 08:42:49.550815 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:49.550789 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-pjck4_a937ea63-07c0-4da3-a786-5a086c699abf/seaweedfs/0.log" Apr 17 08:42:51.591202 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:51.591172 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gxzkb/perf-node-gather-daemonset-84x2q" Apr 17 08:42:53.139343 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:53.139301 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sbpvg_e854e687-08d0-4a97-876c-03bc44afc12b/migrator/0.log" Apr 17 08:42:53.158778 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:53.158754 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sbpvg_e854e687-08d0-4a97-876c-03bc44afc12b/graceful-termination/0.log" Apr 17 08:42:54.546503 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.546471 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5r6l7_6eb6689c-0d36-4771-bcac-8118455cada4/kube-multus/0.log" Apr 17 08:42:54.738797 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.738719 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/kube-multus-additional-cni-plugins/0.log" Apr 17 08:42:54.758651 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.758622 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/egress-router-binary-copy/0.log" Apr 17 08:42:54.778549 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.778526 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/cni-plugins/0.log" Apr 17 08:42:54.797952 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.797922 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/bond-cni-plugin/0.log" Apr 17 08:42:54.817843 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.817821 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/routeoverride-cni/0.log" Apr 17 08:42:54.845257 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.845233 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/whereabouts-cni-bincopy/0.log" Apr 17 08:42:54.865912 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:54.865888 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jgncd_b042621c-fbf8-4739-b758-0e481535b940/whereabouts-cni/0.log" Apr 17 08:42:55.067742 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:55.067647 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7bs5q_b124ed47-021a-4bde-8c03-dcfce0f301d8/network-metrics-daemon/0.log" Apr 17 08:42:55.085378 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:55.085345 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7bs5q_b124ed47-021a-4bde-8c03-dcfce0f301d8/kube-rbac-proxy/0.log" Apr 17 08:42:56.276674 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.276597 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/ovn-controller/0.log" Apr 17 08:42:56.307569 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.307542 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/ovn-acl-logging/0.log" Apr 17 08:42:56.328573 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.328548 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/kube-rbac-proxy-node/0.log" Apr 17 08:42:56.347154 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.347124 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:42:56.363083 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.363052 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/northd/0.log" Apr 17 08:42:56.383055 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.383024 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/nbdb/0.log" Apr 17 08:42:56.402488 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.402460 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/sbdb/0.log" Apr 17 08:42:56.511819 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:56.511788 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x6ppg_21b5c294-9caa-41ea-8bd0-357a8981ec9b/ovnkube-controller/0.log" Apr 17 08:42:57.759709 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:57.759658 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-dh54x_7fe9b09c-633f-48ec-916f-04b365b73fcb/network-check-target-container/0.log" Apr 17 08:42:58.691051 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:58.691027 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xl7sg_18767c55-78c8-48a0-ae7f-c2c09ebac544/iptables-alerter/0.log" Apr 17 08:42:59.330597 ip-10-0-138-233 kubenswrapper[2570]: I0417 08:42:59.330553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wh74p_02603763-8df0-4492-8876-283310358655/tuned/0.log"