Apr 16 20:09:07.856910 ip-10-0-135-182 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:09:07.856920 ip-10-0-135-182 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:09:07.856928 ip-10-0-135-182 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:09:07.857148 ip-10-0-135-182 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:09:17.919304 ip-10-0-135-182 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:09:17.919322 ip-10-0-135-182 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4b767ec7b2a14ae98747fb1709784e99 --
Apr 16 20:11:52.222511 ip-10-0-135-182 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:52.683998 ip-10-0-135-182 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:52.683998 ip-10-0-135-182 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:52.683998 ip-10-0-135-182 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:52.683998 ip-10-0-135-182 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
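The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the flag dump later in the log). A minimal sketch, assuming the upstream KubeletConfiguration v1beta1 field names; the crio.sock endpoint is taken from the logged flags, the other values are purely illustrative:

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf.
# Field names follow kubelet.config.k8s.io/v1beta1; values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the flag dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (illustrative path)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (illustrative reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
```

Note that --minimum-container-ttl-duration has no config-file equivalent; the warning says to use eviction thresholds (evictionHard / evictionSoft in the same file) instead.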
Apr 16 20:11:52.683998 ip-10-0-135-182 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:52.687305 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.687189 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:52.692300 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692274 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:52.692300 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692295 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:52.692300 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692300 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:52.692300 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692305 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692310 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692315 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692320 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692324 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692328 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692331 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692335 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692339 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692343 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692346 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692350 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692354 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692359 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692362 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692366 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692370 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692374 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692378 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692382 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:52.692551 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692385 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692389 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692393 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692397 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692401 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692405 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692409 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692413 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692417 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692421 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692426 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692430 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692434 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692438 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692444 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692448 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692452 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692457 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692461 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692465 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:52.693299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692469 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692473 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692476 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692481 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692485 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692489 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692493 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692501 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692506 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692510 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692515 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692520 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692524 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692530 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692534 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692539 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692543 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692547 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692552 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:52.693799 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692555 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692560 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692564 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692568 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692573 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692579 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692583 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692588 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692596 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692603 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692608 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692613 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692617 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692621 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692625 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692630 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692635 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692640 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692644 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692648 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:52.694367 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692653 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692657 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692662 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.692666 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694503 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694516 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694523 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694528 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694532 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694536 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694540 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694544 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694549 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694553 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694557 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694561 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694565 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694569 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694573 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694578 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:52.695204 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694582 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694586 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694590 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694594 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694598 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694602 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694606 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694611 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694623 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694628 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694632 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694636 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694640 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694647 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694654 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694660 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694664 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694669 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694674 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:52.696065 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694679 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694684 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694688 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694693 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694697 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694702 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694706 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694712 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694716 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694721 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694725 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694729 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694733 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694737 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694742 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694746 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694750 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694755 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694759 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694763 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:52.696737 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694768 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694773 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694785 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694791 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694798 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694802 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694806 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694811 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694816 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694820 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694824 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694828 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694833 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694837 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694841 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694845 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694849 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694853 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694859 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694864 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:52.697248 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694868 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694872 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694876 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694880 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694885 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694889 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694894 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694898 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694902 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694906 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.694910 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695019 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695029 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695043 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695050 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695064 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695070 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695078 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695085 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695090 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695095 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695101 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:52.697860 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695106 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695111 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695116 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695120 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695125 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695130 2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695135 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695139 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695152 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695157 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695162 2569 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695167 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695172 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695178 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695183 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695188 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695206 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695228 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695233 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695238 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695243 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695248 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695256 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695260 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695264 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:52.698632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695269 2569 flags.go:64] FLAG:
--enable-load-reader="false" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695283 2569 flags.go:64] FLAG: --enable-server="true" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695288 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695298 2569 flags.go:64] FLAG: --event-burst="100" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695304 2569 flags.go:64] FLAG: --event-qps="50" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695309 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695314 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695319 2569 flags.go:64] FLAG: --eviction-hard="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695325 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695330 2569 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695335 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695339 2569 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695345 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695349 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695355 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:11:52.699290 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:11:52.695359 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695364 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695369 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695373 2569 flags.go:64] FLAG: --feature-gates="" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695379 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695385 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695390 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695395 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695403 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695409 2569 flags.go:64] FLAG: --help="false" Apr 16 20:11:52.699290 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695413 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-135-182.ec2.internal" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695418 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695423 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695428 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:11:52.695434 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695440 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695445 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695449 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695454 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695460 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695466 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695472 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695477 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695482 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695486 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695491 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695496 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695500 2569 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695505 
2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695510 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695515 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695523 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695528 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695533 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:52.699991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695538 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695542 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695548 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695553 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695557 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695564 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695571 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695577 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695582 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:52.700603 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695587 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695592 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695596 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695601 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695606 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695611 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695623 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695628 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695632 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695639 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695644 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695652 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695657 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695662 2569 flags.go:64] 
FLAG: --pods-per-core="0" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695666 2569 flags.go:64] FLAG: --port="10250" Apr 16 20:11:52.700603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695671 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695676 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0103046f2a50a4c01" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695681 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695686 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695691 2569 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695695 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695700 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695706 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695710 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695715 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695720 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695726 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695731 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695736 2569 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695741 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695747 2569 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695752 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695758 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695763 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695768 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695772 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695777 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695782 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695787 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695791 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695796 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:52.701196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695801 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695808 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695813 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695818 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695822 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695831 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695836 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695840 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695848 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695853 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695857 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695862 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695867 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695871 2569 flags.go:64] FLAG: --v="2" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695878 2569 flags.go:64] FLAG: --version="false" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695885 2569 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695892 2569 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.695897 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696047 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696055 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696060 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696066 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696070 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696074 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:52.701851 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696078 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696082 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696086 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696091 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696095 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 
20:11:52.696099 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696103 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696108 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696112 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696116 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696124 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696128 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696132 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696136 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696141 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696145 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696149 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696153 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:52.702437 
ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696158 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696163 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:52.702437 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696167 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696171 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696175 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696179 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696183 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696187 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696192 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696196 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696200 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696205 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696209 2569 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696231 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696236 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696240 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696245 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696249 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696254 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696258 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696262 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696267 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:52.702993 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696271 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696275 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696280 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 
20:11:52.696285 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696289 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696294 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696300 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696306 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696311 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696315 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696319 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696323 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696328 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696332 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696336 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696341 2569 feature_gate.go:328] unrecognized feature 
gate: VSphereMixedNodeEnv Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696345 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696350 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696354 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:52.703511 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696359 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696363 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696367 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696371 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696377 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696382 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696386 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696390 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696394 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 
20:11:52.696398 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696402 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696406 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696411 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696415 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696420 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696425 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696429 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696434 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696438 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:52.704062 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696442 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.696448 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.697423 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.704065 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.704175 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704243 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704248 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704252 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704256 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704260 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704263 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704266 2569 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704269 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704272 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704275 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:52.704550 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704279 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704283 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704286 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704289 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704292 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704295 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704298 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704300 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704303 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 
20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704306 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704309 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704312 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704314 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704317 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704319 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704322 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704324 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704327 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704329 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:52.704925 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704332 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704335 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704340 
2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704343 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704345 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704348 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704351 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704353 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704356 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704358 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704362 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704364 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704367 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704369 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704372 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 
16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704374 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704377 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704380 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704382 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704385 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:52.705402 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704388 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704390 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704393 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704395 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704397 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704400 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704403 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704405 2569 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704407 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704410 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704412 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704415 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704417 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704420 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704423 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704429 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704432 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704434 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704437 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704441 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704443 2569 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:52.705900 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704446 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704448 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704451 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704453 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704456 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704459 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704462 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704464 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704467 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704469 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704473 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704477 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704480 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704482 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704485 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:52.706482 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704487 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.704493 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704592 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704597 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704600 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704603 2569 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704605 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704609 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704613 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704616 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704619 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704622 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704625 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704628 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704630 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:52.706854 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704633 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704635 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704638 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 
20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704640 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704643 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704645 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704648 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704651 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704654 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704656 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704659 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704661 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704664 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704666 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704669 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704671 2569 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704674 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704678 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704680 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704683 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:52.707258 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704685 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704688 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704691 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704693 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704696 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704698 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704701 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704703 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:52.707778 
ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704706 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704709 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704712 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704714 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704717 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704720 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704722 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704724 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704727 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704730 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704732 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704735 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:52.707778 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704737 2569 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAzure Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704740 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704742 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704745 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704747 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704750 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704752 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704755 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704757 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704760 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704762 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704765 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704768 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:52.708270 ip-10-0-135-182 
kubenswrapper[2569]: W0416 20:11:52.704770 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704773 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704776 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704778 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704781 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704783 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704786 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:52.708270 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704789 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704791 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704794 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704797 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704800 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704802 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall 
Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704805 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704807 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704809 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704812 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704814 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704817 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:52.704819 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.704824 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:52.708758 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.705490 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:11:52.710243 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.710230 2569 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:11:52.711203 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.711186 2569 server.go:1019] "Starting client certificate rotation" Apr 16 20:11:52.711314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.711298 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:52.711355 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.711340 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:52.737735 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.737716 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:52.741598 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.741581 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:52.755714 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.755696 2569 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:11:52.761320 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.761302 2569 log.go:25] "Validated CRI v1 image API" Apr 16 20:11:52.762373 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.762357 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:11:52.766548 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.766526 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 be004400-a362-4dec-85ac-bf0e8906a381:/dev/nvme0n1p4 fa9cd21d-5ddb-4ab6-bfa7-30aaa2e0c47c:/dev/nvme0n1p3] Apr 16 20:11:52.766608 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.766547 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} 
/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:11:52.767884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.767867 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:52.772665 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.772561 2569 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:52.770376937 +0000 UTC m=+0.431974417 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100395 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a575490c79a68e3599efb0ffd7483 SystemUUID:ec2a5754-90c7-9a68-e359-9efb0ffd7483 BootID:4b767ec7-b2a1-4ae9-8747-fb1709784e99 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:11:ca:84:33 Speed:0 Mtu:9001} 
{Name:ens5 MacAddress:02:11:11:ca:84:33 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:b7:fe:b5:c7:53 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:11:52.772665 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.772661 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 20:11:52.772801 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.772789 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:11:52.773769 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773741 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:11:52.773916 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773772 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-182.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:11:52.773962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773926 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:11:52.773962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773935 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:11:52.773962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773948 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:52.774047 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.773966 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:52.775272 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.775261 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:52.775382 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.775373 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:11:52.777858 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.777848 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:11:52.777894 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.777863 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:11:52.777894 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.777875 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:11:52.777894 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.777884 2569 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:11:52.777974 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.777902 2569 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 20:11:52.779031 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.779018 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:52.779092 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.779037 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:52.782882 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.782866 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:11:52.785096 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.785083 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:11:52.786684 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786670 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786692 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786699 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786704 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786710 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786718 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:11:52.786729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786725 2569 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 20:11:52.786878 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786737 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:11:52.786878 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786749 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:11:52.786878 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786758 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:11:52.786878 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786770 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:11:52.786878 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.786778 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:11:52.787495 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.787486 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:11:52.787495 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.787495 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:11:52.788335 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.788306 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g5vrl" Apr 16 20:11:52.791100 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.791087 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:11:52.791179 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.791123 2569 server.go:1295] "Started kubelet" Apr 16 20:11:52.791289 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.791247 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:11:52.791339 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.791306 2569 server_v1.go:47] "podresources" method="list" useActivePods=true 
Apr 16 20:11:52.791339 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.791292 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:11:52.792071 ip-10-0-135-182 systemd[1]: Started Kubernetes Kubelet. Apr 16 20:11:52.793007 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.792986 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:11:52.793704 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.793683 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-182.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:11:52.800518 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.800503 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:11:52.812087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.812067 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:11:52.812277 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.812256 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-g5vrl" Apr 16 20:11:52.812483 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.812463 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:52.813164 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813146 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:11:52.813164 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813163 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:11:52.813294 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813144 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 20:11:52.813372 ip-10-0-135-182 kubenswrapper[2569]: 
I0416 20:11:52.813357 2569 factory.go:55] Registering systemd factory Apr 16 20:11:52.813446 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813422 2569 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:11:52.813609 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.813526 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:52.813676 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813364 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:11:52.813676 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813646 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:11:52.813765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813723 2569 factory.go:153] Registering CRI-O factory Apr 16 20:11:52.813765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813756 2569 factory.go:223] Registration of the crio container factory successfully Apr 16 20:11:52.814086 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.813941 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:11:52.814086 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.814008 2569 factory.go:103] Registering Raw factory Apr 16 20:11:52.814086 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.814029 2569 manager.go:1196] Started watching for new ooms in manager Apr 16 20:11:52.814612 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.814583 2569 manager.go:319] Starting recovery of all containers Apr 16 20:11:52.815350 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.815324 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 20:11:52.815450 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.815328 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:11:52.815828 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.815809 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:11:52.816686 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.815619 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-182.ec2.internal.18a6ef634b768759 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-182.ec2.internal,UID:ip-10-0-135-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-182.ec2.internal,},FirstTimestamp:2026-04-16 20:11:52.791099225 +0000 UTC m=+0.452696725,LastTimestamp:2026-04-16 20:11:52.791099225 +0000 UTC m=+0.452696725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-182.ec2.internal,}" Apr 16 20:11:52.818776 ip-10-0-135-182 kubenswrapper[2569]: 
E0416 20:11:52.818649 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 20:11:52.822724 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.822705 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-182.ec2.internal\" not found" node="ip-10-0-135-182.ec2.internal" Apr 16 20:11:52.825162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.825150 2569 manager.go:324] Recovery completed Apr 16 20:11:52.828756 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.828742 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:52.831662 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.831647 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:52.831731 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.831678 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:52.831731 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.831688 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:52.832154 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.832136 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:11:52.832154 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.832153 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:11:52.832283 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.832176 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:52.834802 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.834791 2569 policy_none.go:49] "None policy: Start" Apr 16 20:11:52.834837 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.834806 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:11:52.834837 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.834815 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:11:52.869790 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.869773 2569 manager.go:341] "Starting Device Plugin manager" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.869838 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.869851 2569 server.go:85] "Starting device plugin registration server" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.870075 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.870085 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.870180 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.870299 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.870310 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.871086 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 20:11:52.884602 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.871124 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:52.943780 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.943704 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 20:11:52.944924 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.944906 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:11:52.945026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.944933 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:11:52.945026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.944952 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 20:11:52.945026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.944960 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:11:52.945145 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.945044 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:11:52.948991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.948970 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:52.970831 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.970815 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:52.972956 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.972937 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:52.973036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.972971 2569 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:52.973036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.972985 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:52.973036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.973009 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-182.ec2.internal" Apr 16 20:11:52.980573 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:52.980560 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-182.ec2.internal" Apr 16 20:11:52.980618 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.980581 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-182.ec2.internal\": node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:52.998820 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:52.998796 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:53.046113 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.046067 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal"] Apr 16 20:11:53.046266 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.046171 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:53.047174 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.047158 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:53.047278 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.047188 2569 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:53.047278 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.047207 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:53.048430 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.048418 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:53.048569 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.048556 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.048604 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.048585 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:53.049261 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049248 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:53.049329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049261 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:53.049329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049275 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:53.049329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049284 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:53.049329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049286 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" 
event="NodeHasSufficientPID" Apr 16 20:11:53.049329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.049295 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:53.050427 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.050413 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.050513 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.050436 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:53.051099 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.051084 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:53.051161 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.051109 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:53.051161 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.051122 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:53.075935 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.075912 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-182.ec2.internal\" not found" node="ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.080158 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.080142 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-182.ec2.internal\" not found" node="ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.098916 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.098898 2569 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:53.199117 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.199017 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:53.214344 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.214320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.214438 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.214350 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.214438 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.214377 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a31b11c9cf44c8a66bd679136b463d4-config\") pod \"kube-apiserver-proxy-ip-10-0-135-182.ec2.internal\" (UID: \"2a31b11c9cf44c8a66bd679136b463d4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.299811 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.299778 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found" Apr 16 20:11:53.315056 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315032 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.315135 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.315135 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a31b11c9cf44c8a66bd679136b463d4-config\") pod \"kube-apiserver-proxy-ip-10-0-135-182.ec2.internal\" (UID: \"2a31b11c9cf44c8a66bd679136b463d4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.315135 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a31b11c9cf44c8a66bd679136b463d4-config\") pod \"kube-apiserver-proxy-ip-10-0-135-182.ec2.internal\" (UID: \"2a31b11c9cf44c8a66bd679136b463d4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" Apr 16 20:11:53.315288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.315288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.315157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b83001bd5a05036cf4903854c418418b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal\" (UID: \"b83001bd5a05036cf4903854c418418b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.378225 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.378173 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.381855 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.381834 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.400279 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.400255 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found"
Apr 16 20:11:53.500880 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.500792 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found"
Apr 16 20:11:53.601320 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.601279 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found"
Apr 16 20:11:53.701842 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.701808 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found"
Apr 16 20:11:53.710980 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.710962 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:53.711144 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.711127 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:53.802808 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:53.802746 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-182.ec2.internal\" not found"
Apr 16 20:11:53.804348 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.804331 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:53.813601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.813578 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.813696 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.813580 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:53.815351 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.815320 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:52 +0000 UTC" deadline="2027-09-19 02:25:52.904408439 +0000 UTC"
Apr 16 20:11:53.815351 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.815348 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12486h13m59.089062776s"
Apr 16 20:11:53.827680 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.827655 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:53.833497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.833479 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:53.835236 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.835185 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal"
Apr 16 20:11:53.844776 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.844760 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:53.852988 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.852966 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wmnzt"
Apr 16 20:11:53.859815 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.859582 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wmnzt"
Apr 16 20:11:53.920779 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:53.920732 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a31b11c9cf44c8a66bd679136b463d4.slice/crio-8930849994d3c3123afe99728258a60c5082e44739bf453171e2d939fc816b0b WatchSource:0}: Error finding container 8930849994d3c3123afe99728258a60c5082e44739bf453171e2d939fc816b0b: Status 404 returned error can't find the container with id 8930849994d3c3123afe99728258a60c5082e44739bf453171e2d939fc816b0b
Apr 16 20:11:53.921175 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:53.921144 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83001bd5a05036cf4903854c418418b.slice/crio-91e2f79ec1a83c84018fd60a0742d282e27516dfaf5dcdae5c06c053a6fed632 WatchSource:0}: Error finding container 91e2f79ec1a83c84018fd60a0742d282e27516dfaf5dcdae5c06c053a6fed632: Status 404 returned error can't find the container with id 91e2f79ec1a83c84018fd60a0742d282e27516dfaf5dcdae5c06c053a6fed632
Apr 16 20:11:53.924555 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.924536 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:11:53.948461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.948417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" event={"ID":"b83001bd5a05036cf4903854c418418b","Type":"ContainerStarted","Data":"91e2f79ec1a83c84018fd60a0742d282e27516dfaf5dcdae5c06c053a6fed632"}
Apr 16 20:11:53.949264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:53.949237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" event={"ID":"2a31b11c9cf44c8a66bd679136b463d4","Type":"ContainerStarted","Data":"8930849994d3c3123afe99728258a60c5082e44739bf453171e2d939fc816b0b"}
Apr 16 20:11:54.188409 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.188295 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:54.367686 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.367522 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:54.623703 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.623628 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:54.779384 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.779348 2569 apiserver.go:52] "Watching apiserver"
Apr 16 20:11:54.785809 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.785779 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:11:54.787729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.787697 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-k66bc","openshift-network-diagnostics/network-check-target-bg95d","openshift-network-operator/iptables-alerter-42vjt","kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx","openshift-dns/node-resolver-zbk8x","openshift-multus/network-metrics-daemon-ck2ww","openshift-ovn-kubernetes/ovnkube-node-kmsm8","kube-system/konnectivity-agent-pdzfv","openshift-cluster-node-tuning-operator/tuned-7sgxp","openshift-image-registry/node-ca-f5hp5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal","openshift-multus/multus-additional-cni-plugins-6zmm9"]
Apr 16 20:11:54.789360 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.789339 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:11:54.790481 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.790452 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:11:54.790579 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.790524 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:11:54.791457 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.791437 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-42vjt"
Apr 16 20:11:54.792173 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.792051 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 20:11:54.792287 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.792191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 20:11:54.792352 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.792285 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wrh2w\""
Apr 16 20:11:54.793922 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.793902 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6tprs\""
Apr 16 20:11:54.794016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.793979 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.794016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.793984 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zbk8x"
Apr 16 20:11:54.794125 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.794106 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:11:54.794205 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.794179 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.796045 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.795596 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.796045 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.795683 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.796184 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.796094 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.796402 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.796383 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gwpsm\""
Apr 16 20:11:54.796486 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.796426 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.796971 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.796953 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.797620 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.797599 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:11:54.797893 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.797872 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.798074 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798048 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:11:54.798853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798254 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6dh22\""
Apr 16 20:11:54.798853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798639 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.798853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798659 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8bb64\""
Apr 16 20:11:54.798853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798681 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.798853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798842 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.799150 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.798965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:11:54.799205 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799177 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:11:54.799288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799176 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:11:54.799473 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799366 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:11:54.799473 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799371 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:11:54.799588 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.799479 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:11:54.799588 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799371 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.799970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799818 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.799970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.799897 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b48jt\""
Apr 16 20:11:54.800633 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.800619 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.802023 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.801916 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:11:54.802023 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.801928 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.802023 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.801946 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f5hp5"
Apr 16 20:11:54.802752 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.802732 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.803018 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.803002 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.803288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.803255 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zf9lf\""
Apr 16 20:11:54.803964 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.803944 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:11:54.804060 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-p9nzq\""
Apr 16 20:11:54.804730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804361 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:11:54.804730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804392 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 20:11:54.804730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804366 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 20:11:54.804730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804508 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 20:11:54.804730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.804523 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vc456\""
Apr 16 20:11:54.814503 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.814485 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:11:54.823011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.822981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.823115 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72d3b5c6-036b-4c05-9113-913e25110e3c-agent-certs\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:11:54.823310 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysconfig\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.823383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-kubernetes\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.823437 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-conf\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.823528 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:11:54.823597 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823570 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-log-socket\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.823682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.823751 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.823897 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-os-release\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.823964 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-socket-dir-parent\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.823964 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823958 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-kubelet\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824064 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.823992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:11:54.824064 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824042 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-system-cni-dir\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.824162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-host\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.824162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-systemd-units\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.824162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824330 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.824381 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-sys-fs\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.824428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-k8s-cni-cncf-io\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824469 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824447 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-bin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824502 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-netns\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824538 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-multus\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824571 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4jr\" (UniqueName: \"kubernetes.io/projected/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-kube-api-access-dk4jr\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.824610 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.824643 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-iptables-alerter-script\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt"
Apr 16 20:11:54.824681 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824666 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8k2\" (UniqueName: \"kubernetes.io/projected/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-kube-api-access-6g8k2\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt"
Apr 16 20:11:54.824715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-etc-tuned\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.824747 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-slash\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.824804 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-script-lib\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.824870 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824824 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cnibin\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.824910 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlw9t\" (UniqueName: \"kubernetes.io/projected/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-kube-api-access-mlw9t\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.824944 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-modprobe-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.824944 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-config\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.825004 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.824978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-daemon-config\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.825036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825009 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-multus-certs\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.825064 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-tmp\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp"
Apr 16 20:11:54.825122 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825089 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-socket-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.825185 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q6dh\" (UniqueName: \"kubernetes.io/projected/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kube-api-access-8q6dh\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.825280 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825251 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-host-slash\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt"
Apr 16 20:11:54.825332 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-netns\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:11:54.825427 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825397 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9"
Apr 16 20:11:54.825502 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825454 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-registration-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx"
Apr 16 20:11:54.825561 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825529 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-system-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.825661 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825643 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-etc-kubernetes\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc"
Apr 16 20:11:54.825728 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nl297\" (UniqueName: \"kubernetes.io/projected/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-kube-api-access-nl297\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:54.825869 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825838 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-ovn\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.825927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-env-overrides\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.825978 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.825955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovn-node-metrics-cert\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.826029 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.826119 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cni-binary-copy\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.826172 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-var-lib-kubelet\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.826209 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fsr6\" (UniqueName: \"kubernetes.io/projected/85487890-a028-49b2-b173-0f3bef2f3039-kube-api-access-7fsr6\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.826564 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-run\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.826632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq2gl\" (UniqueName: \"kubernetes.io/projected/ee08a763-0e02-4cc9-a7fc-f2422edc681b-kube-api-access-bq2gl\") pod \"node-resolver-zbk8x\" (UID: 
\"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.826682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72d3b5c6-036b-4c05-9113-913e25110e3c-konnectivity-ca\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:54.826731 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-device-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.826780 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cnibin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.826780 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-etc-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.826876 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826792 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-node-log\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.826876 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826815 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.826971 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.826847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-conf-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.827058 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-systemd\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.827142 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-systemd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.827326 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827177 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-var-lib-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.827326 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-hostroot\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.827465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-netd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.827465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85487890-a028-49b2-b173-0f3bef2f3039-host\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.827465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85487890-a028-49b2-b173-0f3bef2f3039-serviceca\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:11:54.827661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-lib-modules\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq95f\" (UniqueName: \"kubernetes.io/projected/f73138f6-787e-4ee9-b196-b914563cad39-kube-api-access-bq95f\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-bin\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827805 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-os-release\") pod 
\"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee08a763-0e02-4cc9-a7fc-f2422edc681b-hosts-file\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzdh\" (UniqueName: \"kubernetes.io/projected/0cbc952a-810f-46b7-b791-bccdd61ac1b4-kube-api-access-lqzdh\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.827923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.828049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-sys\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.828148 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee08a763-0e02-4cc9-a7fc-f2422edc681b-tmp-dir\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.828461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.828299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-kubelet\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.860550 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.860518 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:53 +0000 UTC" deadline="2028-01-18 19:52:43.790308604 +0000 UTC" Apr 16 20:11:54.860628 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.860552 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15407h40m48.92975961s" Apr 16 20:11:54.929726 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-device-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.929726 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929682 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cnibin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " 
pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.929726 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929702 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-etc-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-node-log\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929761 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-device-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-etc-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: 
\"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929778 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cnibin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-conf-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-systemd\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929843 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-etc-selinux\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-conf-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " 
pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-node-log\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-systemd\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-systemd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.929969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-var-lib-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-hostroot\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 
20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.929999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-netd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930008 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-systemd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85487890-a028-49b2-b173-0f3bef2f3039-host\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-var-lib-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-hostroot\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:11:54.930051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-netd\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85487890-a028-49b2-b173-0f3bef2f3039-host\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85487890-a028-49b2-b173-0f3bef2f3039-serviceca\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-lib-modules\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq95f\" (UniqueName: \"kubernetes.io/projected/f73138f6-787e-4ee9-b196-b914563cad39-kube-api-access-bq95f\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930263 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-lib-modules\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930278 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-bin\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930333 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-os-release\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee08a763-0e02-4cc9-a7fc-f2422edc681b-hosts-file\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.930586 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzdh\" (UniqueName: \"kubernetes.io/projected/0cbc952a-810f-46b7-b791-bccdd61ac1b4-kube-api-access-lqzdh\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.930586 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-sys\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee08a763-0e02-4cc9-a7fc-f2422edc681b-tmp-dir\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-kubelet\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930515 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72d3b5c6-036b-4c05-9113-913e25110e3c-agent-certs\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysconfig\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/85487890-a028-49b2-b173-0f3bef2f3039-serviceca\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930590 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-kubernetes\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-cni-bin\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-conf\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-openvswitch\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-log-socket\") pod \"ovnkube-node-kmsm8\" (UID: 
\"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-os-release\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-socket-dir-parent\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.931428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-kubelet\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930785 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-conf\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930811 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-system-cni-dir\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930851 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-system-cni-dir\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930885 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-sys\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930936 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ee08a763-0e02-4cc9-a7fc-f2422edc681b-hosts-file\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-os-release\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930993 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-os-release\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.931149 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.931263 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:11:55.431206686 +0000 UTC m=+3.092804157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee08a763-0e02-4cc9-a7fc-f2422edc681b-tmp-dir\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-log-socket\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931324 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-kubernetes\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.932254 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-host\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-socket-dir-parent\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-kubelet\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931372 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-systemd-units\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-systemd-units\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysconfig\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-host\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.930947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-kubelet\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931574 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-sys-fs\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931601 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-k8s-cni-cncf-io\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-sys-fs\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-k8s-cni-cncf-io\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-bin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-bin\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933026 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-netns\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-multus\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-netns\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4jr\" (UniqueName: \"kubernetes.io/projected/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-kube-api-access-dk4jr\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-var-lib-cni-multus\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-iptables-alerter-script\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-sysctl-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931908 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8k2\" (UniqueName: \"kubernetes.io/projected/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-kube-api-access-6g8k2\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-etc-tuned\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.931981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-slash\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-slash\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-script-lib\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:11:54.932060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cnibin\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlw9t\" (UniqueName: \"kubernetes.io/projected/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-kube-api-access-mlw9t\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.933891 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-modprobe-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932140 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cnibin\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-config\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-daemon-config\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-multus-certs\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-tmp\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932280 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-iptables-alerter-script\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-socket-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") 
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932333 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q6dh\" (UniqueName: \"kubernetes.io/projected/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kube-api-access-8q6dh\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-host-slash\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-netns\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932395 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-etc-modprobe-d\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-registration-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-host-run-multus-certs\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-system-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932522 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-host-slash\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.934682 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932524 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-etc-kubernetes\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-etc-kubernetes\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl297\" (UniqueName: \"kubernetes.io/projected/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-kube-api-access-nl297\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-ovn\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-host-run-netns\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-env-overrides\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovn-node-metrics-cert\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cni-binary-copy\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-var-lib-kubelet\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fsr6\" (UniqueName: \"kubernetes.io/projected/85487890-a028-49b2-b173-0f3bef2f3039-kube-api-access-7fsr6\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-config\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-run\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq2gl\" (UniqueName: \"kubernetes.io/projected/ee08a763-0e02-4cc9-a7fc-f2422edc681b-kube-api-access-bq2gl\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932912 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-system-cni-dir\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72d3b5c6-036b-4c05-9113-913e25110e3c-konnectivity-ca\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:54.935523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-registration-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-var-lib-kubelet\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cbc952a-810f-46b7-b791-bccdd61ac1b4-run-ovn\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.932647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovnkube-script-lib\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933070 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-socket-dir\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f73138f6-787e-4ee9-b196-b914563cad39-run\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933592 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72d3b5c6-036b-4c05-9113-913e25110e3c-konnectivity-ca\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-cni-binary-copy\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0cbc952a-810f-46b7-b791-bccdd61ac1b4-env-overrides\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.933790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-multus-daemon-config\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.935540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72d3b5c6-036b-4c05-9113-913e25110e3c-agent-certs\") pod \"konnectivity-agent-pdzfv\" (UID: \"72d3b5c6-036b-4c05-9113-913e25110e3c\") " pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.936331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-etc-tuned\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.936380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.936356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0cbc952a-810f-46b7-b791-bccdd61ac1b4-ovn-node-metrics-cert\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.937658 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.937615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f73138f6-787e-4ee9-b196-b914563cad39-tmp\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.941162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.941133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq95f\" (UniqueName: \"kubernetes.io/projected/f73138f6-787e-4ee9-b196-b914563cad39-kube-api-access-bq95f\") pod \"tuned-7sgxp\" (UID: \"f73138f6-787e-4ee9-b196-b914563cad39\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:54.941575 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.941553 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:54.941667 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.941581 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:54.941667 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.941594 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:54.941667 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:54.941653 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. No retries permitted until 2026-04-16 20:11:55.441634002 +0000 UTC m=+3.103231472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:54.942199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.942174 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fsr6\" (UniqueName: \"kubernetes.io/projected/85487890-a028-49b2-b173-0f3bef2f3039-kube-api-access-7fsr6\") pod \"node-ca-f5hp5\" (UID: \"85487890-a028-49b2-b173-0f3bef2f3039\") " pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:54.943032 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.943012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4jr\" (UniqueName: \"kubernetes.io/projected/b37ad374-9da6-4bb9-ab54-e87d0ccf8712-kube-api-access-dk4jr\") pod \"multus-k66bc\" (UID: \"b37ad374-9da6-4bb9-ab54-e87d0ccf8712\") " pod="openshift-multus/multus-k66bc" Apr 16 20:11:54.944007 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.943953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq2gl\" (UniqueName: \"kubernetes.io/projected/ee08a763-0e02-4cc9-a7fc-f2422edc681b-kube-api-access-bq2gl\") pod \"node-resolver-zbk8x\" (UID: \"ee08a763-0e02-4cc9-a7fc-f2422edc681b\") " pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:54.944201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.944149 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8k2\" (UniqueName: \"kubernetes.io/projected/c7d8e3b9-d6d9-447c-91b1-b9d4184f699e-kube-api-access-6g8k2\") pod \"iptables-alerter-42vjt\" (UID: \"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e\") " pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:54.944602 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.944576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzdh\" (UniqueName: \"kubernetes.io/projected/0cbc952a-810f-46b7-b791-bccdd61ac1b4-kube-api-access-lqzdh\") pod \"ovnkube-node-kmsm8\" (UID: \"0cbc952a-810f-46b7-b791-bccdd61ac1b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:54.944926 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.944907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlw9t\" (UniqueName: \"kubernetes.io/projected/01eeb0f7-ee5b-44af-ab8f-b3296ae5b886-kube-api-access-mlw9t\") pod \"multus-additional-cni-plugins-6zmm9\" (UID: \"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886\") " pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:54.945521 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.945410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl297\" (UniqueName: \"kubernetes.io/projected/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-kube-api-access-nl297\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:54.945765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:54.945740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q6dh\" (UniqueName: \"kubernetes.io/projected/4fc1e536-079c-4bb9-9eb5-6e211948cd4f-kube-api-access-8q6dh\") pod \"aws-ebs-csi-driver-node-cgsnx\" (UID: \"4fc1e536-079c-4bb9-9eb5-6e211948cd4f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:55.103405 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.103366 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pdzfv" Apr 16 20:11:55.112160 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.112136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-42vjt" Apr 16 20:11:55.127890 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.127869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zbk8x" Apr 16 20:11:55.134501 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.134483 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" Apr 16 20:11:55.142101 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.142074 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k66bc" Apr 16 20:11:55.149772 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.149743 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:11:55.156317 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.156293 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" Apr 16 20:11:55.164911 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.164889 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" Apr 16 20:11:55.169975 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.169953 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-f5hp5" Apr 16 20:11:55.435192 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.435163 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:55.435367 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.435322 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:55.435451 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.435409 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.435389209 +0000 UTC m=+4.096986675 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:55.519319 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.519289 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee08a763_0e02_4cc9_a7fc_f2422edc681b.slice/crio-a1633242990bd2e05bdbb15357753f7341ebb757d018415340ba4e8149227c30 WatchSource:0}: Error finding container a1633242990bd2e05bdbb15357753f7341ebb757d018415340ba4e8149227c30: Status 404 returned error can't find the container with id a1633242990bd2e05bdbb15357753f7341ebb757d018415340ba4e8149227c30 Apr 16 20:11:55.520382 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.520355 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cbc952a_810f_46b7_b791_bccdd61ac1b4.slice/crio-897aa52d2a130fee102b374b1ffce2b169667237c059cda40410c218ac6a5367 WatchSource:0}: Error finding container 897aa52d2a130fee102b374b1ffce2b169667237c059cda40410c218ac6a5367: Status 404 returned error can't find the container with id 897aa52d2a130fee102b374b1ffce2b169667237c059cda40410c218ac6a5367 Apr 16 20:11:55.521299 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.521082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d8e3b9_d6d9_447c_91b1_b9d4184f699e.slice/crio-eeb1e5025664338adfb73fd47666660476346b2cc1af88a368d570782812efcd WatchSource:0}: Error finding container eeb1e5025664338adfb73fd47666660476346b2cc1af88a368d570782812efcd: Status 404 returned error can't find the container with id eeb1e5025664338adfb73fd47666660476346b2cc1af88a368d570782812efcd Apr 16 20:11:55.521861 
ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.521841 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01eeb0f7_ee5b_44af_ab8f_b3296ae5b886.slice/crio-5b324cbced029f9be79efc49b97d9998f9e3b89c2938a686e2332b07c2923b5e WatchSource:0}: Error finding container 5b324cbced029f9be79efc49b97d9998f9e3b89c2938a686e2332b07c2923b5e: Status 404 returned error can't find the container with id 5b324cbced029f9be79efc49b97d9998f9e3b89c2938a686e2332b07c2923b5e Apr 16 20:11:55.523958 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.523935 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85487890_a028_49b2_b173_0f3bef2f3039.slice/crio-58a630b42129e823c05e24b10f53b70d5f46d8e6ded5278bd5d939c2a45eaa92 WatchSource:0}: Error finding container 58a630b42129e823c05e24b10f53b70d5f46d8e6ded5278bd5d939c2a45eaa92: Status 404 returned error can't find the container with id 58a630b42129e823c05e24b10f53b70d5f46d8e6ded5278bd5d939c2a45eaa92 Apr 16 20:11:55.525164 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.525140 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc1e536_079c_4bb9_9eb5_6e211948cd4f.slice/crio-1b7b389c23088618f78082697ef9ef6eed134164469d8156bc2b3b69a2be9804 WatchSource:0}: Error finding container 1b7b389c23088618f78082697ef9ef6eed134164469d8156bc2b3b69a2be9804: Status 404 returned error can't find the container with id 1b7b389c23088618f78082697ef9ef6eed134164469d8156bc2b3b69a2be9804 Apr 16 20:11:55.526453 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.526402 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d3b5c6_036b_4c05_9113_913e25110e3c.slice/crio-100a11eefcf71b1ccfc6dd594be3d9e811ccf76fe8284d2440d6e8273419c770 WatchSource:0}: 
Error finding container 100a11eefcf71b1ccfc6dd594be3d9e811ccf76fe8284d2440d6e8273419c770: Status 404 returned error can't find the container with id 100a11eefcf71b1ccfc6dd594be3d9e811ccf76fe8284d2440d6e8273419c770 Apr 16 20:11:55.527630 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:11:55.527447 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37ad374_9da6_4bb9_ab54_e87d0ccf8712.slice/crio-cfed56fcb2877835cf002c2008fe9295fa970ed929381264e2dc2e2e440a9312 WatchSource:0}: Error finding container cfed56fcb2877835cf002c2008fe9295fa970ed929381264e2dc2e2e440a9312: Status 404 returned error can't find the container with id cfed56fcb2877835cf002c2008fe9295fa970ed929381264e2dc2e2e440a9312 Apr 16 20:11:55.536086 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.536065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:55.536249 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.536230 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:55.536320 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.536254 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:55.536320 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.536264 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:55.536320 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.536316 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.536297766 +0000 UTC m=+4.197895239 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:55.861114 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.861033 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:53 +0000 UTC" deadline="2028-01-20 00:31:09.805252604 +0000 UTC" Apr 16 20:11:55.861114 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.861065 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15436h19m13.944191633s" Apr 16 20:11:55.945539 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.945505 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:55.945722 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:55.945655 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:11:55.957265 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.957204 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-42vjt" event={"ID":"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e","Type":"ContainerStarted","Data":"eeb1e5025664338adfb73fd47666660476346b2cc1af88a368d570782812efcd"} Apr 16 20:11:55.962809 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.962778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" event={"ID":"2a31b11c9cf44c8a66bd679136b463d4","Type":"ContainerStarted","Data":"c00991f3184107d98f78213999a62913198c9d5a4a7170dcbbad84258ada3931"} Apr 16 20:11:55.964199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.964164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" event={"ID":"f73138f6-787e-4ee9-b196-b914563cad39","Type":"ContainerStarted","Data":"8a34f316bd1ddec636e63b190458cc5a35a99d9e02db4c878682363daffeac39"} Apr 16 20:11:55.968030 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.967995 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pdzfv" event={"ID":"72d3b5c6-036b-4c05-9113-913e25110e3c","Type":"ContainerStarted","Data":"100a11eefcf71b1ccfc6dd594be3d9e811ccf76fe8284d2440d6e8273419c770"} Apr 16 20:11:55.969508 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:11:55.969481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" event={"ID":"4fc1e536-079c-4bb9-9eb5-6e211948cd4f","Type":"ContainerStarted","Data":"1b7b389c23088618f78082697ef9ef6eed134164469d8156bc2b3b69a2be9804"} Apr 16 20:11:55.971175 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.971133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerStarted","Data":"5b324cbced029f9be79efc49b97d9998f9e3b89c2938a686e2332b07c2923b5e"} Apr 16 20:11:55.973040 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.973016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"897aa52d2a130fee102b374b1ffce2b169667237c059cda40410c218ac6a5367"} Apr 16 20:11:55.974938 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.974887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zbk8x" event={"ID":"ee08a763-0e02-4cc9-a7fc-f2422edc681b","Type":"ContainerStarted","Data":"a1633242990bd2e05bdbb15357753f7341ebb757d018415340ba4e8149227c30"} Apr 16 20:11:55.978578 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.978550 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k66bc" event={"ID":"b37ad374-9da6-4bb9-ab54-e87d0ccf8712","Type":"ContainerStarted","Data":"cfed56fcb2877835cf002c2008fe9295fa970ed929381264e2dc2e2e440a9312"} Apr 16 20:11:55.980494 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.980418 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-182.ec2.internal" podStartSLOduration=2.980377484 podStartE2EDuration="2.980377484s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:55.979733756 +0000 UTC m=+3.641331249" watchObservedRunningTime="2026-04-16 20:11:55.980377484 +0000 UTC m=+3.641974977" Apr 16 20:11:55.981690 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:55.981660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f5hp5" event={"ID":"85487890-a028-49b2-b173-0f3bef2f3039","Type":"ContainerStarted","Data":"58a630b42129e823c05e24b10f53b70d5f46d8e6ded5278bd5d939c2a45eaa92"} Apr 16 20:11:56.444535 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:56.444497 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:56.444699 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.444655 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:56.444778 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.444721 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:11:58.444702883 +0000 UTC m=+6.106300367 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:56.545645 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:56.545605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:56.545826 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.545771 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:56.545826 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.545791 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:56.545826 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.545804 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:56.546003 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.545863 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:58.545844524 +0000 UTC m=+6.207441994 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:56.948102 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:56.948070 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:56.948555 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:56.948201 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:11:57.004151 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:57.004113 2569 generic.go:358] "Generic (PLEG): container finished" podID="b83001bd5a05036cf4903854c418418b" containerID="044214883d58abd4584e6b146ac3d866cbc71ea5ecbc10bc38a97220f8ff708e" exitCode=0 Apr 16 20:11:57.004320 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:57.004277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" event={"ID":"b83001bd5a05036cf4903854c418418b","Type":"ContainerDied","Data":"044214883d58abd4584e6b146ac3d866cbc71ea5ecbc10bc38a97220f8ff708e"} Apr 16 20:11:57.946028 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:57.945994 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:57.946253 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:57.946146 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:11:58.020022 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:58.019985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" event={"ID":"b83001bd5a05036cf4903854c418418b","Type":"ContainerStarted","Data":"dea27cfdb0fed37c5715554ca38c03ce3c821c19adf60e14f89b4400108a0860"} Apr 16 20:11:58.043869 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:58.043810 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-182.ec2.internal" podStartSLOduration=5.043789089 podStartE2EDuration="5.043789089s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:58.043432242 +0000 UTC m=+5.705029732" watchObservedRunningTime="2026-04-16 20:11:58.043789089 +0000 UTC m=+5.705386582" Apr 16 20:11:58.468475 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:58.468435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:58.468651 
ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.468617 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:58.468717 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.468686 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:12:02.468666214 +0000 UTC m=+10.130263681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:58.569905 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:58.569865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:58.570065 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.570023 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:58.570065 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.570041 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:58.570065 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.570054 2569 projected.go:194] Error preparing data for projected volume 
kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.570254 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.570122 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. No retries permitted until 2026-04-16 20:12:02.570102295 +0000 UTC m=+10.231699766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.945834 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:58.945503 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:11:58.945834 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:58.945645 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:11:59.945605 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:11:59.945131 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:11:59.945605 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:11:59.945314 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:00.946013 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:00.945978 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:00.946457 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:00.946125 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:01.945747 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:01.945704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:01.945907 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:01.945847 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:02.501724 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:02.501686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:02.502303 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.501857 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:02.502303 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.501935 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:12:10.501913913 +0000 UTC m=+18.163511397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:02.603028 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:02.602942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:02.603184 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.603099 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:02.603184 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.603126 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:02.603184 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.603139 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:02.603378 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.603228 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:10.603185286 +0000 UTC m=+18.264782756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:02.947618 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:02.946976 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:02.947618 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:02.947086 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:03.945884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:03.945634 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:03.946355 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:03.946021 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:04.945905 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:04.945867 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:04.946316 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:04.946009 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:05.945746 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:05.945703 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:05.945916 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:05.945837 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:06.945973 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:06.945940 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:06.946440 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:06.946047 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:07.945608 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:07.945578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:07.945756 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:07.945726 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:08.945761 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:08.945723 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:08.946226 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:08.945863 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:09.945192 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:09.945158 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:09.945493 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:09.945303 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:10.561515 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:10.561469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:10.562037 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.561664 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:10.562037 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.561768 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:12:26.561744003 +0000 UTC m=+34.223341486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:10.662391 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:10.662348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:10.662579 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.662535 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:10.662579 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.662563 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:10.662579 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.662576 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:10.662729 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.662644 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:26.662625053 +0000 UTC m=+34.324222538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:10.945466 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:10.945390 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:10.945614 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:10.945548 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:11.945359 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:11.945326 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:11.945797 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:11.945452 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:12.946642 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:12.946617 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:12.947209 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:12.946754 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f" Apr 16 20:12:13.047851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.047823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerStarted","Data":"0d98395ce90df7edb3158189b57f2347a30e5ba8e964a7058b574344c1111bc1"} Apr 16 20:12:13.049679 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.049637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"7eff5f3e7b2ae457fff38cd71f7f01d93902228285575b5e77e883b1b5255511"} Apr 16 20:12:13.049771 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.049685 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"b350fec7969bd053218d18e4fd71d27c0d3b31126005eeed6a31f025289dfe95"} Apr 16 20:12:13.050935 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.050911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-zbk8x" event={"ID":"ee08a763-0e02-4cc9-a7fc-f2422edc681b","Type":"ContainerStarted","Data":"a9bbedd5a0926b6c352a93c1c570bc7afe5d9febb837ec381740b1956d166651"} Apr 16 20:12:13.052200 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.052177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k66bc" event={"ID":"b37ad374-9da6-4bb9-ab54-e87d0ccf8712","Type":"ContainerStarted","Data":"0aa5d188a0db7e615043487a89211dd6338de489f61fbc1c1fd3d3b2d7a4d0ed"} Apr 16 20:12:13.053885 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.053534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f5hp5" event={"ID":"85487890-a028-49b2-b173-0f3bef2f3039","Type":"ContainerStarted","Data":"a54545b65be14bea73122bc111522814e9e896574ca23881594c3d850a50031d"} Apr 16 20:12:13.054873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.054853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" event={"ID":"f73138f6-787e-4ee9-b196-b914563cad39","Type":"ContainerStarted","Data":"0d8633cdd1c985385006065e9c56942a06010b82fd1927bc3db94e1a2861dddc"} Apr 16 20:12:13.056361 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.056343 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pdzfv" event={"ID":"72d3b5c6-036b-4c05-9113-913e25110e3c","Type":"ContainerStarted","Data":"4abef849273366aacc4fff7983f49f2c7d1a262ce097ab419ab715e7bdd37661"} Apr 16 20:12:13.057472 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.057426 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" event={"ID":"4fc1e536-079c-4bb9-9eb5-6e211948cd4f","Type":"ContainerStarted","Data":"a3f43a89c6d28982586f6dea264b4acc46a7f5b96a8576f840803689e2a4334b"} Apr 16 20:12:13.087502 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.087406 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zbk8x" podStartSLOduration=2.986017806 podStartE2EDuration="20.087388436s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.520834188 +0000 UTC m=+3.182431669" lastFinishedPulling="2026-04-16 20:12:12.622204832 +0000 UTC m=+20.283802299" observedRunningTime="2026-04-16 20:12:13.08727496 +0000 UTC m=+20.748872450" watchObservedRunningTime="2026-04-16 20:12:13.087388436 +0000 UTC m=+20.748985925" Apr 16 20:12:13.104858 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.104816 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k66bc" podStartSLOduration=2.979268657 podStartE2EDuration="20.104801116s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.529824681 +0000 UTC m=+3.191422162" lastFinishedPulling="2026-04-16 20:12:12.655357152 +0000 UTC m=+20.316954621" observedRunningTime="2026-04-16 20:12:13.104300773 +0000 UTC m=+20.765898261" watchObservedRunningTime="2026-04-16 20:12:13.104801116 +0000 UTC m=+20.766398605" Apr 16 20:12:13.120568 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.120526 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7sgxp" podStartSLOduration=3.028673429 podStartE2EDuration="20.120513931s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.530304836 +0000 UTC m=+3.191902303" lastFinishedPulling="2026-04-16 20:12:12.622145339 +0000 UTC m=+20.283742805" observedRunningTime="2026-04-16 20:12:13.120369745 +0000 UTC m=+20.781967234" watchObservedRunningTime="2026-04-16 20:12:13.120513931 +0000 UTC m=+20.782111419" Apr 16 20:12:13.134676 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.134557 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-f5hp5" podStartSLOduration=3.038004918 podStartE2EDuration="20.13453792s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.525796718 +0000 UTC m=+3.187394185" lastFinishedPulling="2026-04-16 20:12:12.622329706 +0000 UTC m=+20.283927187" observedRunningTime="2026-04-16 20:12:13.133925214 +0000 UTC m=+20.795522703" watchObservedRunningTime="2026-04-16 20:12:13.13453792 +0000 UTC m=+20.796135428" Apr 16 20:12:13.149502 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.149457 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pdzfv" podStartSLOduration=8.873769029 podStartE2EDuration="21.149444798s" podCreationTimestamp="2026-04-16 20:11:52 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.528768031 +0000 UTC m=+3.190365501" lastFinishedPulling="2026-04-16 20:12:07.8044438 +0000 UTC m=+15.466041270" observedRunningTime="2026-04-16 20:12:13.149170693 +0000 UTC m=+20.810768183" watchObservedRunningTime="2026-04-16 20:12:13.149444798 +0000 UTC m=+20.811042287" Apr 16 20:12:13.946203 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.945978 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:13.946342 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:13.946321 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc" Apr 16 20:12:13.947504 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:13.947485 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:12:14.060695 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.060662 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-42vjt" event={"ID":"c7d8e3b9-d6d9-447c-91b1-b9d4184f699e","Type":"ContainerStarted","Data":"d70e21cd2dcc59a7f502f84a17985a62357c38db30ad93818918bbfaec7cd621"} Apr 16 20:12:14.062209 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.062190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" event={"ID":"4fc1e536-079c-4bb9-9eb5-6e211948cd4f","Type":"ContainerStarted","Data":"44d37a3051be59ff057346eaecf21bfb704ad50e09f80fbcc08ba2c8da1c4b05"} Apr 16 20:12:14.063508 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.063485 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="0d98395ce90df7edb3158189b57f2347a30e5ba8e964a7058b574344c1111bc1" exitCode=0 Apr 16 20:12:14.063587 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.063563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"0d98395ce90df7edb3158189b57f2347a30e5ba8e964a7058b574344c1111bc1"} Apr 16 20:12:14.066423 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.066399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"6e0fb689bda0f57265f8d41a4189c6b3d0912780c8629312bd9c4a77595a8cb6"} 
Apr 16 20:12:14.066505 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.066432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"62b701aa00dbdecf40399f39312219e02881ff7572d45ef1430f3fa2ced2cfab"}
Apr 16 20:12:14.066505 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.066458 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"9e271f72eb3ce52f413e27e3b88fa1358e1856e4983b0ba27aa2eed66287128c"}
Apr 16 20:12:14.066505 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.066468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"e6423a02d022d2b418c64a87ec64a0f067cb7f9c5303ad5aadea47f9ca798afb"}
Apr 16 20:12:14.074345 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.074305 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-42vjt" podStartSLOduration=3.974802356 podStartE2EDuration="21.07429334s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.523038229 +0000 UTC m=+3.184635699" lastFinishedPulling="2026-04-16 20:12:12.622529208 +0000 UTC m=+20.284126683" observedRunningTime="2026-04-16 20:12:14.074020666 +0000 UTC m=+21.735618155" watchObservedRunningTime="2026-04-16 20:12:14.07429334 +0000 UTC m=+21.735890828"
Apr 16 20:12:14.882761 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.882607 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:13.94749977Z","UUID":"bf069594-2003-4f6d-a285-c5380755178b","Handler":null,"Name":"","Endpoint":""}
Apr 16 20:12:14.884482 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.884449 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 20:12:14.884589 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.884493 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 20:12:14.945818 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:14.945787 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:14.945968 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:14.945907 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:15.070063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:15.070012 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" event={"ID":"4fc1e536-079c-4bb9-9eb5-6e211948cd4f","Type":"ContainerStarted","Data":"6383f9d67420d3f677156791b492f69edd8762084b4608043d597e7f1cc7330b"}
Apr 16 20:12:15.091448 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:15.091404 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cgsnx" podStartSLOduration=3.018294429 podStartE2EDuration="22.091387411s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.527619479 +0000 UTC m=+3.189216961" lastFinishedPulling="2026-04-16 20:12:14.600712475 +0000 UTC m=+22.262309943" observedRunningTime="2026-04-16 20:12:15.091006592 +0000 UTC m=+22.752604075" watchObservedRunningTime="2026-04-16 20:12:15.091387411 +0000 UTC m=+22.752984902"
Apr 16 20:12:15.945943 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:15.945907 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:15.946136 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:15.946062 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:16.074762 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:16.074722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"ffbd68ae273b4c0bafaa632e45ce98711754257bd815aa0c70ee7bc4dd74c5f3"}
Apr 16 20:12:16.946166 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:16.946137 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:16.946348 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:16.946258 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:17.526932 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:17.526894 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:12:17.527851 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:17.527834 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:12:17.945576 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:17.945542 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:17.945728 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:17.945655 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:18.079665 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:18.079636 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:12:18.080754 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:18.080736 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pdzfv"
Apr 16 20:12:18.945841 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:18.945661 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:18.946271 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:18.945911 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:19.082275 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.082243 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="ff2457863ada539081d52d226c93dd38bd45695deea496f16aeb62643768d3ad" exitCode=0
Apr 16 20:12:19.082440 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.082329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"ff2457863ada539081d52d226c93dd38bd45695deea496f16aeb62643768d3ad"}
Apr 16 20:12:19.085440 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.085413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" event={"ID":"0cbc952a-810f-46b7-b791-bccdd61ac1b4","Type":"ContainerStarted","Data":"fdb60c56d2cba87b0f07a28b5ff9b9b40fe059195643b5249ca63a5e1a488bfc"}
Apr 16 20:12:19.085820 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.085804 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:12:19.099978 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.099956 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:12:19.139979 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.139939 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" podStartSLOduration=8.944162627 podStartE2EDuration="26.139928446s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.522581322 +0000 UTC m=+3.184178803" lastFinishedPulling="2026-04-16 20:12:12.718347143 +0000 UTC m=+20.379944622" observedRunningTime="2026-04-16 20:12:19.136622298 +0000 UTC m=+26.798219787" watchObservedRunningTime="2026-04-16 20:12:19.139928446 +0000 UTC m=+26.801525935"
Apr 16 20:12:19.945600 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:19.945574 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:19.945736 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:19.945674 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:20.087880 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:20.087849 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:12:20.087880 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:20.087884 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:12:20.101330 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:20.101306 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8"
Apr 16 20:12:20.946176 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:20.946144 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:20.946388 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:20.946257 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:21.090538 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:21.090497 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="99e4a21c4c07403ff5c594c87082dbbbe28817c705462c016044081baa55bffb" exitCode=0
Apr 16 20:12:21.090920 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:21.090592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"99e4a21c4c07403ff5c594c87082dbbbe28817c705462c016044081baa55bffb"}
Apr 16 20:12:21.945498 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:21.945286 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:21.945721 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:21.945589 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:22.946393 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:22.946365 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:22.946742 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:22.946447 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:23.096052 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:23.096015 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="8884c064fd404f11acc3f985dbf0943aea5b4b7c336fdb9f3b732f8ec98d9d80" exitCode=0
Apr 16 20:12:23.096241 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:23.096069 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"8884c064fd404f11acc3f985dbf0943aea5b4b7c336fdb9f3b732f8ec98d9d80"}
Apr 16 20:12:23.945754 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:23.945718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:23.945909 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:23.945865 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:24.945260 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:24.945228 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:24.945681 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:24.945342 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:25.945578 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:25.945542 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:25.946041 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:25.945692 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:26.578790 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:26.578755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:26.579019 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.578924 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:26.579019 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.579008 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs podName:f2865cec-958e-49f5-9bd1-57d8fbb3fefc nodeName:}" failed. No retries permitted until 2026-04-16 20:12:58.578986462 +0000 UTC m=+66.240583929 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs") pod "network-metrics-daemon-ck2ww" (UID: "f2865cec-958e-49f5-9bd1-57d8fbb3fefc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:26.679534 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:26.679496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:26.679703 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.679679 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:26.679703 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.679702 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:26.679826 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.679713 2569 projected.go:194] Error preparing data for projected volume kube-api-access-qbc2l for pod openshift-network-diagnostics/network-check-target-bg95d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:26.679826 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.679781 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l podName:4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f nodeName:}" failed. No retries permitted until 2026-04-16 20:12:58.679760604 +0000 UTC m=+66.341358073 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qbc2l" (UniqueName: "kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l") pod "network-check-target-bg95d" (UID: "4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:26.945472 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:26.945389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:26.945623 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:26.945506 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:27.945371 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:27.945340 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:27.945529 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:27.945463 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:28.398957 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:28.398919 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ck2ww"]
Apr 16 20:12:28.399753 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:28.399080 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:28.399753 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:28.399227 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:28.401410 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:28.401108 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bg95d"]
Apr 16 20:12:28.401410 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:28.401251 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:28.401410 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:28.401358 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:29.110770 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:29.110581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerStarted","Data":"fe7fea11562a1e24d1a18bd9ea5c258fe4a6b2d1207bd14cb8572dda0beb3d6a"}
Apr 16 20:12:29.945629 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:29.945592 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d"
Apr 16 20:12:29.946006 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:29.945598 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww"
Apr 16 20:12:29.946006 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:29.945696 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg95d" podUID="4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f"
Apr 16 20:12:29.946006 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:29.945779 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ck2ww" podUID="f2865cec-958e-49f5-9bd1-57d8fbb3fefc"
Apr 16 20:12:30.114996 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:30.114965 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="fe7fea11562a1e24d1a18bd9ea5c258fe4a6b2d1207bd14cb8572dda0beb3d6a" exitCode=0
Apr 16 20:12:30.115142 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:30.115024 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"fe7fea11562a1e24d1a18bd9ea5c258fe4a6b2d1207bd14cb8572dda0beb3d6a"}
Apr 16 20:12:31.119306 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.119267 2569 generic.go:358] "Generic (PLEG): container finished" podID="01eeb0f7-ee5b-44af-ab8f-b3296ae5b886" containerID="edf7947a2b222299f91c69bfe38f95ea891389aab6bb40935a07a2bf4ecdf445" exitCode=0
Apr 16 20:12:31.119701 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.119354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerDied","Data":"edf7947a2b222299f91c69bfe38f95ea891389aab6bb40935a07a2bf4ecdf445"}
Apr 16 20:12:31.703936 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.703865 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-182.ec2.internal" event="NodeReady"
Apr 16 20:12:31.704079 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.703981 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 20:12:31.736251 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.736203 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-658f999b89-l7k42"]
Apr 16 20:12:31.757761 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.757738 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nzqv4"]
Apr 16 20:12:31.757912 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.757895 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658f999b89-l7k42"
Apr 16 20:12:31.760068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.760043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:12:31.760201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.760064 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ksxbp\""
Apr 16 20:12:31.760201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.760138 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:12:31.760201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.760140 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:12:31.764728 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.764707 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:12:31.781626 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.781600 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zf7nk"]
Apr 16 20:12:31.781782 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.781766 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nzqv4"
Apr 16 20:12:31.784607 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.784584 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:12:31.784721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.784613 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:12:31.784721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.784686 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:12:31.785017 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.784995 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:12:31.785465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.785440 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bmglr\""
Apr 16 20:12:31.803977 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.803956 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-658f999b89-l7k42"]
Apr 16 20:12:31.804068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.803993 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nzqv4"]
Apr 16 20:12:31.804068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.804006 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zf7nk"]
Apr 16 20:12:31.804068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.804025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zf7nk"
Apr 16 20:12:31.806207 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.806189 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:12:31.806439 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.806422 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:12:31.806548 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.806533 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tnjdz\""
Apr 16 20:12:31.854436 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.854413 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sxqhm"]
Apr 16 20:12:31.882755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.882730 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sxqhm"]
Apr 16 20:12:31.882864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.882829 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:31.885062 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.885045 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:12:31.885159 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.885114 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:12:31.885159 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.885114 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v7v7t\"" Apr 16 20:12:31.885278 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.885252 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:12:31.922762 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43c8d38b-37ab-4156-91da-345b7bf10494-ca-trust-extracted\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.922873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922768 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad308773-bd17-4488-9a1d-78314d278c1a-crio-socket\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:31.922873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922788 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54ln\" (UniqueName: \"kubernetes.io/projected/ad308773-bd17-4488-9a1d-78314d278c1a-kube-api-access-b54ln\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:31.922873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5be8917-e623-43fd-9af9-3ccbaba1d169-tmp-dir\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:31.922970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5be8917-e623-43fd-9af9-3ccbaba1d169-metrics-tls\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:31.922970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prp2n\" (UniqueName: \"kubernetes.io/projected/c5be8917-e623-43fd-9af9-3ccbaba1d169-kube-api-access-prp2n\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:31.922970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-registry-certificates\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " 
pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.922970 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad308773-bd17-4488-9a1d-78314d278c1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:31.923089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-image-registry-private-configuration\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.923089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.922992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-registry-tls\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.923089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5be8917-e623-43fd-9af9-3ccbaba1d169-config-volume\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:31.923089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923047 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad308773-bd17-4488-9a1d-78314d278c1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:31.923089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad308773-bd17-4488-9a1d-78314d278c1a-data-volume\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:31.923263 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923090 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp9cg\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-kube-api-access-dp9cg\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.923263 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923154 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-bound-sa-token\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.923263 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-trusted-ca\") pod 
\"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.923263 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.923259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-installation-pull-secrets\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:31.945973 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.945953 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:31.945973 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.945965 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:31.948593 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.948577 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\"" Apr 16 20:12:31.948686 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.948581 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\"" Apr 16 20:12:31.948770 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.948757 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:31.948871 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:31.948856 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:31.948932 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:12:31.948857 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:32.024199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-trusted-ca\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.024383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79g74\" (UniqueName: \"kubernetes.io/projected/7b290614-0660-49a1-a6ac-a56ab51a99b4-kube-api-access-79g74\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.024383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-installation-pull-secrets\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.024383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43c8d38b-37ab-4156-91da-345b7bf10494-ca-trust-extracted\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.024383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024291 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad308773-bd17-4488-9a1d-78314d278c1a-crio-socket\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.024383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b54ln\" (UniqueName: \"kubernetes.io/projected/ad308773-bd17-4488-9a1d-78314d278c1a-kube-api-access-b54ln\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024417 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5be8917-e623-43fd-9af9-3ccbaba1d169-tmp-dir\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5be8917-e623-43fd-9af9-3ccbaba1d169-metrics-tls\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prp2n\" (UniqueName: \"kubernetes.io/projected/c5be8917-e623-43fd-9af9-3ccbaba1d169-kube-api-access-prp2n\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:12:32.024525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-registry-certificates\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad308773-bd17-4488-9a1d-78314d278c1a-crio-socket\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024551 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad308773-bd17-4488-9a1d-78314d278c1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-image-registry-private-configuration\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.024632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-registry-tls\") pod 
\"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.025021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5be8917-e623-43fd-9af9-3ccbaba1d169-config-volume\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.025021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad308773-bd17-4488-9a1d-78314d278c1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.025021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad308773-bd17-4488-9a1d-78314d278c1a-data-volume\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.025021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024751 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dp9cg\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-kube-api-access-dp9cg\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.025021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.024772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5be8917-e623-43fd-9af9-3ccbaba1d169-tmp-dir\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.025294 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad308773-bd17-4488-9a1d-78314d278c1a-data-volume\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.025436 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad308773-bd17-4488-9a1d-78314d278c1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.025495 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-registry-certificates\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.025495 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43c8d38b-37ab-4156-91da-345b7bf10494-ca-trust-extracted\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.025495 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:12:32.024780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-bound-sa-token\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.025639 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b290614-0660-49a1-a6ac-a56ab51a99b4-cert\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.025639 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43c8d38b-37ab-4156-91da-345b7bf10494-trusted-ca\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.026040 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.025888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5be8917-e623-43fd-9af9-3ccbaba1d169-config-volume\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.028962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.028942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad308773-bd17-4488-9a1d-78314d278c1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " 
pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.028962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.028955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5be8917-e623-43fd-9af9-3ccbaba1d169-metrics-tls\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.029111 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.028988 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-registry-tls\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.029111 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.029004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-installation-pull-secrets\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.029373 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.029353 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43c8d38b-37ab-4156-91da-345b7bf10494-image-registry-private-configuration\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.032359 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.032335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prp2n\" (UniqueName: 
\"kubernetes.io/projected/c5be8917-e623-43fd-9af9-3ccbaba1d169-kube-api-access-prp2n\") pod \"dns-default-zf7nk\" (UID: \"c5be8917-e623-43fd-9af9-3ccbaba1d169\") " pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.032451 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.032392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54ln\" (UniqueName: \"kubernetes.io/projected/ad308773-bd17-4488-9a1d-78314d278c1a-kube-api-access-b54ln\") pod \"insights-runtime-extractor-nzqv4\" (UID: \"ad308773-bd17-4488-9a1d-78314d278c1a\") " pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.032451 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.032423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp9cg\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-kube-api-access-dp9cg\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.032816 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.032797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43c8d38b-37ab-4156-91da-345b7bf10494-bound-sa-token\") pod \"image-registry-658f999b89-l7k42\" (UID: \"43c8d38b-37ab-4156-91da-345b7bf10494\") " pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.068796 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.068770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:32.091533 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.091503 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nzqv4" Apr 16 20:12:32.112385 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.112351 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:32.124997 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.124907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" event={"ID":"01eeb0f7-ee5b-44af-ab8f-b3296ae5b886","Type":"ContainerStarted","Data":"51eaf039f102bd7798166cb70088be0cc8d465d8f28985fec5e730679ee80786"} Apr 16 20:12:32.125775 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.125755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b290614-0660-49a1-a6ac-a56ab51a99b4-cert\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.125884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.125793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79g74\" (UniqueName: \"kubernetes.io/projected/7b290614-0660-49a1-a6ac-a56ab51a99b4-kube-api-access-79g74\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.129086 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.129061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b290614-0660-49a1-a6ac-a56ab51a99b4-cert\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.133068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.133041 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79g74\" 
(UniqueName: \"kubernetes.io/projected/7b290614-0660-49a1-a6ac-a56ab51a99b4-kube-api-access-79g74\") pod \"ingress-canary-sxqhm\" (UID: \"7b290614-0660-49a1-a6ac-a56ab51a99b4\") " pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.146620 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.146574 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6zmm9" podStartSLOduration=5.762899633 podStartE2EDuration="39.146561278s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.523953891 +0000 UTC m=+3.185551373" lastFinishedPulling="2026-04-16 20:12:28.907615552 +0000 UTC m=+36.569213018" observedRunningTime="2026-04-16 20:12:32.14511663 +0000 UTC m=+39.806714118" watchObservedRunningTime="2026-04-16 20:12:32.146561278 +0000 UTC m=+39.808158766" Apr 16 20:12:32.190962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.190931 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sxqhm" Apr 16 20:12:32.304166 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.304133 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zf7nk"] Apr 16 20:12:32.308486 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.308437 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nzqv4"] Apr 16 20:12:32.313871 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:32.313772 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad308773_bd17_4488_9a1d_78314d278c1a.slice/crio-7d856fe0e287151c6d93abb12219ec0721f2d5c69b926ca85c5adc36bda81af9 WatchSource:0}: Error finding container 7d856fe0e287151c6d93abb12219ec0721f2d5c69b926ca85c5adc36bda81af9: Status 404 returned error can't find the container with id 7d856fe0e287151c6d93abb12219ec0721f2d5c69b926ca85c5adc36bda81af9 Apr 16 20:12:32.315538 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.315512 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-658f999b89-l7k42"] Apr 16 20:12:32.322852 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:32.322827 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c8d38b_37ab_4156_91da_345b7bf10494.slice/crio-5fe97ccaa393d11883a02c65ec6bcebdc43dad1472cd855cb8dd429d594f9424 WatchSource:0}: Error finding container 5fe97ccaa393d11883a02c65ec6bcebdc43dad1472cd855cb8dd429d594f9424: Status 404 returned error can't find the container with id 5fe97ccaa393d11883a02c65ec6bcebdc43dad1472cd855cb8dd429d594f9424 Apr 16 20:12:32.347519 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:32.347478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sxqhm"] Apr 16 20:12:32.352323 ip-10-0-135-182 
kubenswrapper[2569]: W0416 20:12:32.352296 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b290614_0660_49a1_a6ac_a56ab51a99b4.slice/crio-0732b063f7c705f3c8d7aad608c9e468df75cda82a0e17bcd7af5d415db4ac77 WatchSource:0}: Error finding container 0732b063f7c705f3c8d7aad608c9e468df75cda82a0e17bcd7af5d415db4ac77: Status 404 returned error can't find the container with id 0732b063f7c705f3c8d7aad608c9e468df75cda82a0e17bcd7af5d415db4ac77 Apr 16 20:12:33.128127 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.128087 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sxqhm" event={"ID":"7b290614-0660-49a1-a6ac-a56ab51a99b4","Type":"ContainerStarted","Data":"0732b063f7c705f3c8d7aad608c9e468df75cda82a0e17bcd7af5d415db4ac77"} Apr 16 20:12:33.129069 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.129045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf7nk" event={"ID":"c5be8917-e623-43fd-9af9-3ccbaba1d169","Type":"ContainerStarted","Data":"4dab2cbf68a8faa7f450a64c92f6128b8a0e541d9fbd61c159d9ec895800f62d"} Apr 16 20:12:33.130456 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.130430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nzqv4" event={"ID":"ad308773-bd17-4488-9a1d-78314d278c1a","Type":"ContainerStarted","Data":"40dab4ba8589471a7943c0aee6d16bfdf08285029c7e15688df8c5c714125d60"} Apr 16 20:12:33.130553 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.130464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nzqv4" event={"ID":"ad308773-bd17-4488-9a1d-78314d278c1a","Type":"ContainerStarted","Data":"7d856fe0e287151c6d93abb12219ec0721f2d5c69b926ca85c5adc36bda81af9"} Apr 16 20:12:33.131954 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.131928 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-658f999b89-l7k42" event={"ID":"43c8d38b-37ab-4156-91da-345b7bf10494","Type":"ContainerStarted","Data":"db92687959a7835254c71ab8f1a692a19578c05022dd0cb8a3e864ef4bc12651"} Apr 16 20:12:33.132048 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.131955 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-658f999b89-l7k42" event={"ID":"43c8d38b-37ab-4156-91da-345b7bf10494","Type":"ContainerStarted","Data":"5fe97ccaa393d11883a02c65ec6bcebdc43dad1472cd855cb8dd429d594f9424"} Apr 16 20:12:33.132148 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.132134 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:33.168208 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:33.168159 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-658f999b89-l7k42" podStartSLOduration=6.168139891 podStartE2EDuration="6.168139891s" podCreationTimestamp="2026-04-16 20:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:33.167119233 +0000 UTC m=+40.828716767" watchObservedRunningTime="2026-04-16 20:12:33.168139891 +0000 UTC m=+40.829737385" Apr 16 20:12:34.136007 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:34.135970 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nzqv4" event={"ID":"ad308773-bd17-4488-9a1d-78314d278c1a","Type":"ContainerStarted","Data":"d0afe1f57349197679e8fee36fbe738efce78328be888cdccea2120845cbc840"} Apr 16 20:12:35.141303 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:35.141060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sxqhm" 
event={"ID":"7b290614-0660-49a1-a6ac-a56ab51a99b4","Type":"ContainerStarted","Data":"fa03c6b00aa7cc90e8b13f1f33f7548a0cc36d5396844e157c32503a53257082"} Apr 16 20:12:35.142811 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:35.142780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf7nk" event={"ID":"c5be8917-e623-43fd-9af9-3ccbaba1d169","Type":"ContainerStarted","Data":"4cb1b13cf3b7f9161a4d898b4c48d9562f732c6eebdba5835db8aee517ada62e"} Apr 16 20:12:35.144834 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:35.144811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nzqv4" event={"ID":"ad308773-bd17-4488-9a1d-78314d278c1a","Type":"ContainerStarted","Data":"9edd9eab939dc0e33353160aa38c6540ea4468b467dfe2deb987941c552e030f"} Apr 16 20:12:35.156523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:35.156469 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sxqhm" podStartSLOduration=1.501932424 podStartE2EDuration="4.15645232s" podCreationTimestamp="2026-04-16 20:12:31 +0000 UTC" firstStartedPulling="2026-04-16 20:12:32.354090958 +0000 UTC m=+40.015688425" lastFinishedPulling="2026-04-16 20:12:35.008610848 +0000 UTC m=+42.670208321" observedRunningTime="2026-04-16 20:12:35.155844925 +0000 UTC m=+42.817442415" watchObservedRunningTime="2026-04-16 20:12:35.15645232 +0000 UTC m=+42.818049810" Apr 16 20:12:35.174844 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:35.174779 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nzqv4" podStartSLOduration=1.573777635 podStartE2EDuration="4.174760456s" podCreationTimestamp="2026-04-16 20:12:31 +0000 UTC" firstStartedPulling="2026-04-16 20:12:32.40866853 +0000 UTC m=+40.070266011" lastFinishedPulling="2026-04-16 20:12:35.00965135 +0000 UTC m=+42.671248832" observedRunningTime="2026-04-16 
20:12:35.17159387 +0000 UTC m=+42.833191360" watchObservedRunningTime="2026-04-16 20:12:35.174760456 +0000 UTC m=+42.836357945" Apr 16 20:12:36.149598 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:36.149558 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zf7nk" event={"ID":"c5be8917-e623-43fd-9af9-3ccbaba1d169","Type":"ContainerStarted","Data":"fd20063edc7123afb47adb18602587f10ee1e183914a268ddbddfef961d63b90"} Apr 16 20:12:36.166621 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:36.166571 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zf7nk" podStartSLOduration=2.474389662 podStartE2EDuration="5.166557602s" podCreationTimestamp="2026-04-16 20:12:31 +0000 UTC" firstStartedPulling="2026-04-16 20:12:32.310318238 +0000 UTC m=+39.971915714" lastFinishedPulling="2026-04-16 20:12:35.002486186 +0000 UTC m=+42.664083654" observedRunningTime="2026-04-16 20:12:36.165519947 +0000 UTC m=+43.827117436" watchObservedRunningTime="2026-04-16 20:12:36.166557602 +0000 UTC m=+43.828155114" Apr 16 20:12:37.152490 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:37.152457 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:42.997933 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:42.997904 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:12:43.000477 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.000455 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.003318 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.003298 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:12:43.003816 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.003798 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:12:43.004013 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.003868 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:12:43.004013 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.003984 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:12:43.004174 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.004019 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qtq7x\"" Apr 16 20:12:43.004174 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.004032 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:12:43.004174 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.004104 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:12:43.004765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.004748 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:12:43.008714 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.008695 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 20:12:43.016550 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:12:43.016529 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:12:43.093293 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093293 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn7qn\" (UniqueName: \"kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093394 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.093496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.093436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" 
Apr 16 20:12:43.194314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194333 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194614 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194614 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn7qn\" (UniqueName: \"kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194614 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.194752 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.194629 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca\") pod \"console-c869b96f4-s2lgn\" (UID: 
\"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.195128 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.195097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.195267 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.195124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.195267 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.195201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.195368 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.195350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.198378 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.198358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config\") pod \"console-c869b96f4-s2lgn\" 
(UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.198479 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.198431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.204561 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.204536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn7qn\" (UniqueName: \"kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn\") pod \"console-c869b96f4-s2lgn\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.310322 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.310238 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:43.429374 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:43.429195 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:12:43.434414 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:43.434382 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd103b181_6102_40b7_8f86_fc8adc683c01.slice/crio-f925d4b3021ea055c8c617f00e65307b1d73e3a9c82aee0c0603d2cbc252e1e4 WatchSource:0}: Error finding container f925d4b3021ea055c8c617f00e65307b1d73e3a9c82aee0c0603d2cbc252e1e4: Status 404 returned error can't find the container with id f925d4b3021ea055c8c617f00e65307b1d73e3a9c82aee0c0603d2cbc252e1e4 Apr 16 20:12:44.170930 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:44.170888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c869b96f4-s2lgn" event={"ID":"d103b181-6102-40b7-8f86-fc8adc683c01","Type":"ContainerStarted","Data":"f925d4b3021ea055c8c617f00e65307b1d73e3a9c82aee0c0603d2cbc252e1e4"} Apr 16 20:12:45.193340 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.193302 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2"] Apr 16 20:12:45.199792 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.199771 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.202407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.202381 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-n7pmb\"" Apr 16 20:12:45.202677 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.202659 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 20:12:45.203256 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.203235 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:12:45.203356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.203342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:12:45.203633 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.203615 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:12:45.203735 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.203640 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 20:12:45.206279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.204508 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nbdcc"] Apr 16 20:12:45.211474 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.211450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: 
\"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.211580 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.211500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.211580 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.211551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.211695 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.211654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkcg\" (UniqueName: \"kubernetes.io/projected/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-kube-api-access-vjkcg\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.214373 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.214353 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2"] Apr 16 20:12:45.214499 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.214486 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.216841 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.216812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 20:12:45.216940 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.216885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2jtlf\"" Apr 16 20:12:45.217016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.216996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 20:12:45.217084 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.217066 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 20:12:45.221753 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.221714 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nbdcc"] Apr 16 20:12:45.229715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.229696 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ftphk"] Apr 16 20:12:45.235389 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.235368 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.237852 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.237832 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:12:45.238236 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.237925 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:12:45.238236 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.237975 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:12:45.238400 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.238384 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6m8jn\"" Apr 16 20:12:45.312545 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312515 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkcg\" (UniqueName: \"kubernetes.io/projected/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-kube-api-access-vjkcg\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.312545 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312597 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-metrics-client-ca\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6wws\" (UniqueName: \"kubernetes.io/projected/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-api-access-v6wws\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-wtmp\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.312755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.312762 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-accelerators-collector-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312812 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmb4\" (UniqueName: \"kubernetes.io/projected/f649df33-f768-4559-8efb-4679bd198e57-kube-api-access-ldmb4\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.312829 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls podName:61e1abe3-3b21-40a0-9ec5-c4dbf542029e nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.812809955 +0000 UTC m=+53.474407425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-5rbs2" (UID: "61e1abe3-3b21-40a0-9ec5-c4dbf542029e") : secret "openshift-state-metrics-tls" not found Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312892 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-root\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-sys\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.312960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313009 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.313055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313433 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.313433 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313101 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.313433 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313125 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-textfile\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.313719 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.313694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.316351 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.316330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.333047 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.333018 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkcg\" (UniqueName: \"kubernetes.io/projected/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-kube-api-access-vjkcg\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: 
\"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.414440 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-metrics-client-ca\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6wws\" (UniqueName: \"kubernetes.io/projected/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-api-access-v6wws\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414580 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-wtmp\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414611 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414641 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-accelerators-collector-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmb4\" (UniqueName: \"kubernetes.io/projected/f649df33-f768-4559-8efb-4679bd198e57-kube-api-access-ldmb4\") pod \"node-exporter-ftphk\" 
(UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-root\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-sys\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414855 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-root\") pod 
\"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.414982 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-textfile\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.415373 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.414983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-wtmp\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.415373 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.415033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f649df33-f768-4559-8efb-4679bd198e57-sys\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.415373 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.415178 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 20:12:45.415373 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.415272 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls podName:f649df33-f768-4559-8efb-4679bd198e57 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.915251605 +0000 UTC m=+53.576849119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls") pod "node-exporter-ftphk" (UID: "f649df33-f768-4559-8efb-4679bd198e57") : secret "node-exporter-tls" not found Apr 16 20:12:45.415607 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.415510 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 20:12:45.415607 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:45.415553 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls podName:dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.915540149 +0000 UTC m=+53.577137622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-nbdcc" (UID: "dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1") : secret "kube-state-metrics-tls" not found Apr 16 20:12:45.416410 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.416374 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-textfile\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.416603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.416560 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-metrics-client-ca\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.416714 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.416674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-accelerators-collector-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.417372 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.417309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 
20:12:45.417372 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.417322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.417598 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.417486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.417729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.417708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.424978 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.424938 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6wws\" (UniqueName: \"kubernetes.io/projected/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-api-access-v6wws\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.425720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.425262 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ldmb4\" (UniqueName: \"kubernetes.io/projected/f649df33-f768-4559-8efb-4679bd198e57-kube-api-access-ldmb4\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.818733 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.818676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.821467 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.821442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/61e1abe3-3b21-40a0-9ec5-c4dbf542029e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5rbs2\" (UID: \"61e1abe3-3b21-40a0-9ec5-c4dbf542029e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:45.919095 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.919053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.919279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.919254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:45.921275 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.921252 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f649df33-f768-4559-8efb-4679bd198e57-node-exporter-tls\") pod \"node-exporter-ftphk\" (UID: \"f649df33-f768-4559-8efb-4679bd198e57\") " pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:45.921400 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:45.921381 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-nbdcc\" (UID: \"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:46.113752 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.113676 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" Apr 16 20:12:46.125764 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.125605 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" Apr 16 20:12:46.146553 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.146529 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ftphk" Apr 16 20:12:46.156061 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:46.156014 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf649df33_f768_4559_8efb_4679bd198e57.slice/crio-389670c802a7df6db5369c42399f1c380d8b71237995c1d21e4fe0638e4b8d00 WatchSource:0}: Error finding container 389670c802a7df6db5369c42399f1c380d8b71237995c1d21e4fe0638e4b8d00: Status 404 returned error can't find the container with id 389670c802a7df6db5369c42399f1c380d8b71237995c1d21e4fe0638e4b8d00 Apr 16 20:12:46.182371 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.182047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ftphk" event={"ID":"f649df33-f768-4559-8efb-4679bd198e57","Type":"ContainerStarted","Data":"389670c802a7df6db5369c42399f1c380d8b71237995c1d21e4fe0638e4b8d00"} Apr 16 20:12:46.251185 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.251141 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2"] Apr 16 20:12:46.257821 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.257801 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-nbdcc"] Apr 16 20:12:46.261122 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.261102 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:12:46.273130 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.273108 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.275546 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:12:46.275546 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:12:46.275694 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275600 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:12:46.275694 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:12:46.275694 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275661 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 20:12:46.275813 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275735 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:12:46.275813 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275761 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 20:12:46.275813 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275796 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 20:12:46.275925 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275868 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5g2qj\"" Apr 16 20:12:46.275977 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.275962 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 20:12:46.280584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.280564 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:12:46.282326 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:46.282289 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e1abe3_3b21_40a0_9ec5_c4dbf542029e.slice/crio-29ba00b8e5310835610b6c9d950048ac5e068e590d3c97c2a22e568695b2ecfc WatchSource:0}: Error finding container 29ba00b8e5310835610b6c9d950048ac5e068e590d3c97c2a22e568695b2ecfc: Status 404 returned error can't find the container with id 29ba00b8e5310835610b6c9d950048ac5e068e590d3c97c2a22e568695b2ecfc Apr 16 20:12:46.283146 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:46.283085 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfff4d3b_2ab4_4a3e_a829_fc1bae4421d1.slice/crio-4e879085827a099cb911b3141c09825848c9239a1ef644f9a07773c03c038441 WatchSource:0}: Error finding container 4e879085827a099cb911b3141c09825848c9239a1ef644f9a07773c03c038441: Status 404 returned error can't find the container with id 4e879085827a099cb911b3141c09825848c9239a1ef644f9a07773c03c038441 Apr 16 20:12:46.323986 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.323934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.323986 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.323971 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324108 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmx7\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324178 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324258 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324258 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324248 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324362 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324362 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324362 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324514 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324379 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324514 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324408 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324514 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324463 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.324514 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.324490 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.424958 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.424927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.424967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume\") pod \"alertmanager-main-0\" 
(UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425001 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmx7\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7\") pod 
\"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425447 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.425447 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:46.425230 2569 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 20:12:46.425447 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:12:46.425285 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls podName:de260e29-0694-4f3f-ba1e-5010b43cd219 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.925265686 +0000 UTC m=+54.586863153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219") : secret "alertmanager-main-tls" not found Apr 16 20:12:46.425969 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.426341 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.426052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.426490 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.425944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.426649 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.426606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 16 20:12:46.426893 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.426780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.426893 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.426844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.428957 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.428923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.430130 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.429675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.430130 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.429694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.430130 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.430052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.431107 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.430649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.431107 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.431048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.431295 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.431191 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.431446 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.431423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.431876 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.431849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.432489 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.432457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.433870 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.433837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmx7\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.931560 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.931510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:46.934896 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:46.934869 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:47.157412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.157378 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zf7nk" Apr 16 20:12:47.184818 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.184735 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:12:47.186974 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.186943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c869b96f4-s2lgn" event={"ID":"d103b181-6102-40b7-8f86-fc8adc683c01","Type":"ContainerStarted","Data":"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898"} Apr 16 20:12:47.188228 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.188193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" event={"ID":"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1","Type":"ContainerStarted","Data":"4e879085827a099cb911b3141c09825848c9239a1ef644f9a07773c03c038441"} Apr 16 20:12:47.189958 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.189932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" event={"ID":"61e1abe3-3b21-40a0-9ec5-c4dbf542029e","Type":"ContainerStarted","Data":"149f7ead8dfe9a1f7186580957c98adc254adbe6ad442ebb08391939816b7389"} Apr 16 20:12:47.190059 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.189967 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" 
event={"ID":"61e1abe3-3b21-40a0-9ec5-c4dbf542029e","Type":"ContainerStarted","Data":"4abe65a2731b57486c3b9720f17fa24950f336ec1d33b34796bf7d31f45dd5a1"} Apr 16 20:12:47.190059 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.189980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" event={"ID":"61e1abe3-3b21-40a0-9ec5-c4dbf542029e","Type":"ContainerStarted","Data":"29ba00b8e5310835610b6c9d950048ac5e068e590d3c97c2a22e568695b2ecfc"} Apr 16 20:12:47.383242 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.383031 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c869b96f4-s2lgn" podStartSLOduration=2.517587005 podStartE2EDuration="5.383013454s" podCreationTimestamp="2026-04-16 20:12:42 +0000 UTC" firstStartedPulling="2026-04-16 20:12:43.436005572 +0000 UTC m=+51.097603039" lastFinishedPulling="2026-04-16 20:12:46.301432006 +0000 UTC m=+53.963029488" observedRunningTime="2026-04-16 20:12:47.233863465 +0000 UTC m=+54.895460954" watchObservedRunningTime="2026-04-16 20:12:47.383013454 +0000 UTC m=+55.044610945" Apr 16 20:12:47.383793 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:47.383750 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:12:47.660301 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:47.660264 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde260e29_0694_4f3f_ba1e_5010b43cd219.slice/crio-d2679d5bac06a990277ad2e9c23bb2d89e71ce6c231a63a6e35a97d261f13286 WatchSource:0}: Error finding container d2679d5bac06a990277ad2e9c23bb2d89e71ce6c231a63a6e35a97d261f13286: Status 404 returned error can't find the container with id d2679d5bac06a990277ad2e9c23bb2d89e71ce6c231a63a6e35a97d261f13286 Apr 16 20:12:48.193961 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:48.193921 2569 generic.go:358] 
"Generic (PLEG): container finished" podID="f649df33-f768-4559-8efb-4679bd198e57" containerID="b4bab19e86a560d0e4d8255e7814d5508fcb30e40ad0d04ba0080cf8f3fd57fc" exitCode=0 Apr 16 20:12:48.194226 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:48.193995 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ftphk" event={"ID":"f649df33-f768-4559-8efb-4679bd198e57","Type":"ContainerDied","Data":"b4bab19e86a560d0e4d8255e7814d5508fcb30e40ad0d04ba0080cf8f3fd57fc"} Apr 16 20:12:48.195096 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:48.195075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"d2679d5bac06a990277ad2e9c23bb2d89e71ce6c231a63a6e35a97d261f13286"} Apr 16 20:12:49.200353 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.200309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ftphk" event={"ID":"f649df33-f768-4559-8efb-4679bd198e57","Type":"ContainerStarted","Data":"ffa243524d65583cbeb0f36b14719148d1ae8d783e3b86cfc955360236faeb0e"} Apr 16 20:12:49.200774 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.200358 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ftphk" event={"ID":"f649df33-f768-4559-8efb-4679bd198e57","Type":"ContainerStarted","Data":"6676c4ebf13b6aec679693660c3cb1e449a1a27fdc68bd4fbb82e52ec2ef3cac"} Apr 16 20:12:49.202156 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.202129 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" event={"ID":"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1","Type":"ContainerStarted","Data":"05692697da98f87967144d43a5ca36563b845f5cf21504444e6e0aa11120cd81"} Apr 16 20:12:49.202300 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.202163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" event={"ID":"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1","Type":"ContainerStarted","Data":"66bdaa1f6f49e0df02510391644112faf6f441066d66ccef7d06e2c49290dedf"} Apr 16 20:12:49.202300 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.202176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" event={"ID":"dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1","Type":"ContainerStarted","Data":"f0c65776295f231f209cff7ae644d66f3612529b426f0b3403399cb77f4fba03"} Apr 16 20:12:49.203827 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.203807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" event={"ID":"61e1abe3-3b21-40a0-9ec5-c4dbf542029e","Type":"ContainerStarted","Data":"c0e578138380d5177d8d456a505c066147772f452f5dca160a116d40d074b6b1"} Apr 16 20:12:49.219596 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.219559 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ftphk" podStartSLOduration=3.148690066 podStartE2EDuration="4.21954843s" podCreationTimestamp="2026-04-16 20:12:45 +0000 UTC" firstStartedPulling="2026-04-16 20:12:46.157954561 +0000 UTC m=+53.819552048" lastFinishedPulling="2026-04-16 20:12:47.22881293 +0000 UTC m=+54.890410412" observedRunningTime="2026-04-16 20:12:49.218352953 +0000 UTC m=+56.879950442" watchObservedRunningTime="2026-04-16 20:12:49.21954843 +0000 UTC m=+56.881145898" Apr 16 20:12:49.236025 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.235980 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-nbdcc" podStartSLOduration=2.291717767 podStartE2EDuration="4.235970795s" podCreationTimestamp="2026-04-16 20:12:45 +0000 UTC" firstStartedPulling="2026-04-16 20:12:46.293611586 +0000 UTC m=+53.955209059" 
lastFinishedPulling="2026-04-16 20:12:48.237864606 +0000 UTC m=+55.899462087" observedRunningTime="2026-04-16 20:12:49.23517801 +0000 UTC m=+56.896775499" watchObservedRunningTime="2026-04-16 20:12:49.235970795 +0000 UTC m=+56.897568296" Apr 16 20:12:49.250876 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:49.250834 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5rbs2" podStartSLOduration=2.540229446 podStartE2EDuration="4.250821806s" podCreationTimestamp="2026-04-16 20:12:45 +0000 UTC" firstStartedPulling="2026-04-16 20:12:46.529081594 +0000 UTC m=+54.190679061" lastFinishedPulling="2026-04-16 20:12:48.239673949 +0000 UTC m=+55.901271421" observedRunningTime="2026-04-16 20:12:49.250371394 +0000 UTC m=+56.911968884" watchObservedRunningTime="2026-04-16 20:12:49.250821806 +0000 UTC m=+56.912419293" Apr 16 20:12:50.207622 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:50.207587 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686" exitCode=0 Apr 16 20:12:50.208038 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:50.207696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686"} Apr 16 20:12:50.616497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:50.616458 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:12:52.103174 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:52.103151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmsm8" Apr 16 20:12:52.215195 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:52.215150 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"} Apr 16 20:12:52.215195 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:52.215190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"} Apr 16 20:12:52.215195 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:52.215201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"} Apr 16 20:12:53.220532 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:53.220447 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"} Apr 16 20:12:53.220532 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:53.220482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"} Apr 16 20:12:53.220532 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:53.220493 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerStarted","Data":"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"} Apr 16 20:12:53.253791 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:12:53.253725 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.029653355 podStartE2EDuration="7.25371116s" podCreationTimestamp="2026-04-16 20:12:46 +0000 UTC" firstStartedPulling="2026-04-16 20:12:47.662172864 +0000 UTC m=+55.323770332" lastFinishedPulling="2026-04-16 20:12:52.886230648 +0000 UTC m=+60.547828137" observedRunningTime="2026-04-16 20:12:53.25251555 +0000 UTC m=+60.914113039" watchObservedRunningTime="2026-04-16 20:12:53.25371116 +0000 UTC m=+60.915308648" Apr 16 20:12:53.310590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:53.310559 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:12:54.140164 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:54.140138 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-658f999b89-l7k42" Apr 16 20:12:58.631637 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.631596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:58.633940 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.633919 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:58.644833 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.644801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f2865cec-958e-49f5-9bd1-57d8fbb3fefc-metrics-certs\") pod \"network-metrics-daemon-ck2ww\" (UID: \"f2865cec-958e-49f5-9bd1-57d8fbb3fefc\") " 
pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:58.661622 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.661595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\"" Apr 16 20:12:58.669621 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.669606 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ck2ww" Apr 16 20:12:58.732054 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.732023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:58.735003 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.734968 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:58.746435 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.746410 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:58.755053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.755031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbc2l\" (UniqueName: \"kubernetes.io/projected/4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f-kube-api-access-qbc2l\") pod \"network-check-target-bg95d\" (UID: \"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f\") " pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:58.786349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.786321 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ck2ww"] Apr 16 
20:12:58.956745 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.956663 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\"" Apr 16 20:12:58.965584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:58.965566 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:12:59.091303 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:59.091283 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bg95d"] Apr 16 20:12:59.093665 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:12:59.093639 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec7ab1f_5d4d_4ae6_9ba5_216bc7dd364f.slice/crio-ffc8df772d74b82de5b97d2903f0a316470fd3055e588f7317a882605f382971 WatchSource:0}: Error finding container ffc8df772d74b82de5b97d2903f0a316470fd3055e588f7317a882605f382971: Status 404 returned error can't find the container with id ffc8df772d74b82de5b97d2903f0a316470fd3055e588f7317a882605f382971 Apr 16 20:12:59.241595 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:59.241553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bg95d" event={"ID":"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f","Type":"ContainerStarted","Data":"ffc8df772d74b82de5b97d2903f0a316470fd3055e588f7317a882605f382971"} Apr 16 20:12:59.242480 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:12:59.242459 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ck2ww" event={"ID":"f2865cec-958e-49f5-9bd1-57d8fbb3fefc","Type":"ContainerStarted","Data":"0c5bc1bec84bac0bc86986a620f8db213e56a8ad6a4a23a719211cd8eeadee6f"} Apr 16 20:13:01.250793 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:01.250751 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ck2ww" event={"ID":"f2865cec-958e-49f5-9bd1-57d8fbb3fefc","Type":"ContainerStarted","Data":"fc800e0e813c0977d659876487ec1c25fbe98a46996e7992b675c11659c712b1"} Apr 16 20:13:01.250793 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:01.250795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ck2ww" event={"ID":"f2865cec-958e-49f5-9bd1-57d8fbb3fefc","Type":"ContainerStarted","Data":"5ff4b2741b0dbfaa0e65960deabc18c0917b072e0c8d8af766cffb624a75d36a"} Apr 16 20:13:01.267055 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:01.266982 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ck2ww" podStartSLOduration=66.820962675 podStartE2EDuration="1m8.266962496s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:12:58.791691817 +0000 UTC m=+66.453289286" lastFinishedPulling="2026-04-16 20:13:00.237691624 +0000 UTC m=+67.899289107" observedRunningTime="2026-04-16 20:13:01.265951115 +0000 UTC m=+68.927548605" watchObservedRunningTime="2026-04-16 20:13:01.266962496 +0000 UTC m=+68.928559984" Apr 16 20:13:02.254801 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:02.254762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bg95d" event={"ID":"4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f","Type":"ContainerStarted","Data":"a4afac346a1f4c0e7bf5f2472de9bca1f2a0bc55f20d1d52054a62a0bfb36d35"} Apr 16 20:13:02.273662 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:02.273611 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bg95d" podStartSLOduration=66.490698433 podStartE2EDuration="1m9.273594089s" podCreationTimestamp="2026-04-16 20:11:53 +0000 UTC" firstStartedPulling="2026-04-16 20:12:59.095650592 +0000 UTC m=+66.757248059" 
lastFinishedPulling="2026-04-16 20:13:01.878546233 +0000 UTC m=+69.540143715" observedRunningTime="2026-04-16 20:13:02.272909048 +0000 UTC m=+69.934506547" watchObservedRunningTime="2026-04-16 20:13:02.273594089 +0000 UTC m=+69.935191578" Apr 16 20:13:03.257926 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:03.257892 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:13:15.635470 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.635424 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c869b96f4-s2lgn" podUID="d103b181-6102-40b7-8f86-fc8adc683c01" containerName="console" containerID="cri-o://52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898" gracePeriod=15 Apr 16 20:13:15.867884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.867860 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c869b96f4-s2lgn_d103b181-6102-40b7-8f86-fc8adc683c01/console/0.log" Apr 16 20:13:15.868004 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.867946 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:13:15.986407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986374 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986496 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986536 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986573 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986749 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986649 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986749 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:13:15.986692 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn7qn\" (UniqueName: \"kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.986749 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986706 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:15.986749 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.986718 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert\") pod \"d103b181-6102-40b7-8f86-fc8adc683c01\" (UID: \"d103b181-6102-40b7-8f86-fc8adc683c01\") " Apr 16 20:13:15.987368 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.987198 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca" (OuterVolumeSpecName: "service-ca") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:15.987497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.987430 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:15.987497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.987453 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-trusted-ca-bundle\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:15.987497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.987462 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config" (OuterVolumeSpecName: "console-config") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:15.993533 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.993138 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn" (OuterVolumeSpecName: "kube-api-access-zn7qn") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "kube-api-access-zn7qn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:15.993533 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.993148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:15.994507 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:15.994421 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d103b181-6102-40b7-8f86-fc8adc683c01" (UID: "d103b181-6102-40b7-8f86-fc8adc683c01"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:16.088424 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088388 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-oauth-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.088424 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088417 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zn7qn\" (UniqueName: \"kubernetes.io/projected/d103b181-6102-40b7-8f86-fc8adc683c01-kube-api-access-zn7qn\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.088605 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088428 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.088605 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088460 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-service-ca\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.088605 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088469 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d103b181-6102-40b7-8f86-fc8adc683c01-console-oauth-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.088605 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.088478 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d103b181-6102-40b7-8f86-fc8adc683c01-console-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:13:16.296021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.295946 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c869b96f4-s2lgn_d103b181-6102-40b7-8f86-fc8adc683c01/console/0.log" Apr 16 20:13:16.296021 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.295985 2569 generic.go:358] "Generic (PLEG): container finished" podID="d103b181-6102-40b7-8f86-fc8adc683c01" containerID="52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898" exitCode=2 Apr 16 20:13:16.296201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.296022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c869b96f4-s2lgn" event={"ID":"d103b181-6102-40b7-8f86-fc8adc683c01","Type":"ContainerDied","Data":"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898"} Apr 16 20:13:16.296201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.296061 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c869b96f4-s2lgn" 
event={"ID":"d103b181-6102-40b7-8f86-fc8adc683c01","Type":"ContainerDied","Data":"f925d4b3021ea055c8c617f00e65307b1d73e3a9c82aee0c0603d2cbc252e1e4"} Apr 16 20:13:16.296201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.296078 2569 scope.go:117] "RemoveContainer" containerID="52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898" Apr 16 20:13:16.296201 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.296077 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c869b96f4-s2lgn" Apr 16 20:13:16.304195 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.304178 2569 scope.go:117] "RemoveContainer" containerID="52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898" Apr 16 20:13:16.304483 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:13:16.304461 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898\": container with ID starting with 52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898 not found: ID does not exist" containerID="52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898" Apr 16 20:13:16.304547 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.304491 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898"} err="failed to get container status \"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898\": rpc error: code = NotFound desc = could not find container \"52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898\": container with ID starting with 52fc445f04ba670d71c2e86042e701aed115e21944410e3159d5d834dc461898 not found: ID does not exist" Apr 16 20:13:16.316781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.316755 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:13:16.319262 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.319240 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c869b96f4-s2lgn"] Apr 16 20:13:16.949275 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:16.949238 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d103b181-6102-40b7-8f86-fc8adc683c01" path="/var/lib/kubelet/pods/d103b181-6102-40b7-8f86-fc8adc683c01/volumes" Apr 16 20:13:34.266916 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:13:34.266885 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bg95d" Apr 16 20:14:05.798679 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.798641 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:05.799199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799056 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="alertmanager" containerID="cri-o://723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782" gracePeriod=120 Apr 16 20:14:05.799199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799110 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-metric" containerID="cri-o://8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045" gracePeriod=120 Apr 16 20:14:05.799199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799151 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="config-reloader" 
containerID="cri-o://055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558" gracePeriod=120 Apr 16 20:14:05.799199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799188 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy" containerID="cri-o://f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa" gracePeriod=120 Apr 16 20:14:05.799199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799195 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="prom-label-proxy" containerID="cri-o://3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914" gracePeriod=120 Apr 16 20:14:05.799523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:05.799139 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-web" containerID="cri-o://4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4" gracePeriod=120 Apr 16 20:14:06.437274 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437234 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914" exitCode=0 Apr 16 20:14:06.437274 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437268 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045" exitCode=0 Apr 16 20:14:06.437274 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437275 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" 
containerID="f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa" exitCode=0 Apr 16 20:14:06.437274 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437282 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558" exitCode=0 Apr 16 20:14:06.437544 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437287 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782" exitCode=0 Apr 16 20:14:06.437544 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"} Apr 16 20:14:06.437544 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"} Apr 16 20:14:06.437544 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"} Apr 16 20:14:06.437544 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"} Apr 16 20:14:06.437544 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:06.437385 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"} Apr 16 20:14:07.031681 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.031657 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.137234 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137182 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137249 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137289 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137313 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume\") pod 
\"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137348 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137407 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137380 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137426 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137820 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137748 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:07.137820 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137798 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.137820 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137795 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:07.138063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137846 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.138063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137882 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmx7\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.138063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137916 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.138063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137947 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.138063 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.137975 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out\") pod \"de260e29-0694-4f3f-ba1e-5010b43cd219\" (UID: \"de260e29-0694-4f3f-ba1e-5010b43cd219\") " Apr 16 20:14:07.138349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.138176 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.138349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.138194 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-alertmanager-main-db\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.139349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.139274 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:14:07.140188 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.140046 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume" (OuterVolumeSpecName: "config-volume") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.140188 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.140161 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:07.140346 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.140290 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.140388 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.140349 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.140835 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.140806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.141141 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.141114 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out" (OuterVolumeSpecName: "config-out") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:07.141392 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.141363 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7" (OuterVolumeSpecName: "kube-api-access-tvmx7") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "kube-api-access-tvmx7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:07.141740 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.141724 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.144404 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.144380 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.151365 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.151341 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config" (OuterVolumeSpecName: "web-config") pod "de260e29-0694-4f3f-ba1e-5010b43cd219" (UID: "de260e29-0694-4f3f-ba1e-5010b43cd219"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:07.239575 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239519 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239575 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239564 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239575 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239574 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-main-tls\") on node \"ip-10-0-135-182.ec2.internal\" 
DevicePath \"\"" Apr 16 20:14:07.239575 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239585 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvmx7\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-kube-api-access-tvmx7\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239575 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239596 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-web-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239605 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de260e29-0694-4f3f-ba1e-5010b43cd219-metrics-client-ca\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239614 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de260e29-0694-4f3f-ba1e-5010b43cd219-config-out\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239622 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de260e29-0694-4f3f-ba1e-5010b43cd219-tls-assets\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239630 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-cluster-tls-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239639 2569 reconciler_common.go:299] 
"Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\""
Apr 16 20:14:07.239861 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.239648 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/de260e29-0694-4f3f-ba1e-5010b43cd219-config-volume\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\""
Apr 16 20:14:07.443025 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.442935 2569 generic.go:358] "Generic (PLEG): container finished" podID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerID="4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4" exitCode=0
Apr 16 20:14:07.443194 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.443020 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"}
Apr 16 20:14:07.443194 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.443049 2569 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.443194 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.443074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"de260e29-0694-4f3f-ba1e-5010b43cd219","Type":"ContainerDied","Data":"d2679d5bac06a990277ad2e9c23bb2d89e71ce6c231a63a6e35a97d261f13286"}
Apr 16 20:14:07.443194 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.443095 2569 scope.go:117] "RemoveContainer" containerID="3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"
Apr 16 20:14:07.450183 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.450161 2569 scope.go:117] "RemoveContainer" containerID="8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"
Apr 16 20:14:07.456532 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.456515 2569 scope.go:117] "RemoveContainer" containerID="f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"
Apr 16 20:14:07.462426 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.462410 2569 scope.go:117] "RemoveContainer" containerID="4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"
Apr 16 20:14:07.466062 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.465963 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:14:07.469062 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.469017 2569 scope.go:117] "RemoveContainer" containerID="055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"
Apr 16 20:14:07.470597 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.470581 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:14:07.475507 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.475487 2569 scope.go:117] "RemoveContainer" containerID="723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"
Apr 16 20:14:07.483740
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.483724 2569 scope.go:117] "RemoveContainer" containerID="576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686"
Apr 16 20:14:07.489832 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.489816 2569 scope.go:117] "RemoveContainer" containerID="3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"
Apr 16 20:14:07.490066 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.490048 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914\": container with ID starting with 3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914 not found: ID does not exist" containerID="3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"
Apr 16 20:14:07.490122 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490073 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914"} err="failed to get container status \"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914\": rpc error: code = NotFound desc = could not find container \"3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914\": container with ID starting with 3d4bc9f3a4bf4f53f5ca8dd3809052fb086ef9ed95bb8f6de70ed0745e2c9914 not found: ID does not exist"
Apr 16 20:14:07.490122 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490091 2569 scope.go:117] "RemoveContainer" containerID="8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"
Apr 16 20:14:07.490294 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.490280 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045\": container with ID starting with
8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045 not found: ID does not exist" containerID="8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"
Apr 16 20:14:07.490339 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490299 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045"} err="failed to get container status \"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045\": rpc error: code = NotFound desc = could not find container \"8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045\": container with ID starting with 8cec960d933102d4ead1fc4ac4a370514b18b9c7572e0e58c5df6e3213828045 not found: ID does not exist"
Apr 16 20:14:07.490339 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490312 2569 scope.go:117] "RemoveContainer" containerID="f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"
Apr 16 20:14:07.490514 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.490494 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa\": container with ID starting with f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa not found: ID does not exist" containerID="f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"
Apr 16 20:14:07.490563 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490529 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa"} err="failed to get container status \"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa\": rpc error: code = NotFound desc = could not find container \"f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa\": container with ID starting with
f42d9904d644e971b09d955bfd80139350c14570a9b76f62fae991cb3f5793aa not found: ID does not exist"
Apr 16 20:14:07.490563 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490546 2569 scope.go:117] "RemoveContainer" containerID="4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"
Apr 16 20:14:07.490739 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.490724 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4\": container with ID starting with 4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4 not found: ID does not exist" containerID="4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"
Apr 16 20:14:07.490784 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490742 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4"} err="failed to get container status \"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4\": rpc error: code = NotFound desc = could not find container \"4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4\": container with ID starting with 4ffb40ffdbb598bacb2a397df810bc33e49b70d27ed41f2d8269573237f993e4 not found: ID does not exist"
Apr 16 20:14:07.490784 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490754 2569 scope.go:117] "RemoveContainer" containerID="055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"
Apr 16 20:14:07.490975 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.490962 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558\": container with ID starting with 055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558 not found: ID does not exist"
containerID="055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"
Apr 16 20:14:07.491016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490978 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558"} err="failed to get container status \"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558\": rpc error: code = NotFound desc = could not find container \"055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558\": container with ID starting with 055f16775884f84ba7b60ebf7c9ae96d0ebb1b17f72c0fd14752e42f968fe558 not found: ID does not exist"
Apr 16 20:14:07.491016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.490989 2569 scope.go:117] "RemoveContainer" containerID="723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"
Apr 16 20:14:07.491164 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.491146 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782\": container with ID starting with 723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782 not found: ID does not exist" containerID="723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"
Apr 16 20:14:07.491209 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.491168 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782"} err="failed to get container status \"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782\": rpc error: code = NotFound desc = could not find container \"723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782\": container with ID starting with 723d5706980a91cd7b33add32de17541d4c1d1cf3ce78d1f38913898acd3a782 not found: ID does not exist"
Apr 16
20:14:07.491209 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.491181 2569 scope.go:117] "RemoveContainer" containerID="576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686"
Apr 16 20:14:07.491399 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:14:07.491380 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686\": container with ID starting with 576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686 not found: ID does not exist" containerID="576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686"
Apr 16 20:14:07.491451 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.491405 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686"} err="failed to get container status \"576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686\": rpc error: code = NotFound desc = could not find container \"576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686\": container with ID starting with 576bdc962b11274b915385bb8636298fa7ee3a2f9a9c6fe15cfcd1b375fc9686 not found: ID does not exist"
Apr 16 20:14:07.501515 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501494 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:14:07.501719 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501707 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-metric"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501721 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-metric"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416
20:14:07.501735 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="init-config-reloader"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501741 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="init-config-reloader"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501747 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="prom-label-proxy"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501752 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="prom-label-proxy"
Apr 16 20:14:07.501760 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501760 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="alertmanager"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501765 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="alertmanager"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501770 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501775 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501782 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-web"
Apr 16
20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501786 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-web"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501793 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="config-reloader"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501798 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="config-reloader"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501805 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d103b181-6102-40b7-8f86-fc8adc683c01" containerName="console"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501809 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d103b181-6102-40b7-8f86-fc8adc683c01" containerName="console"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501844 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d103b181-6102-40b7-8f86-fc8adc683c01" containerName="console"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501852 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="config-reloader"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501859 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-web"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501866 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy-metric"
Apr 16
20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501872 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="alertmanager"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501877 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="kube-rbac-proxy"
Apr 16 20:14:07.501928 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.501882 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" containerName="prom-label-proxy"
Apr 16 20:14:07.504314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.504300 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.507046 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.507028 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 20:14:07.507133 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.507065 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 20:14:07.507622 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.507604 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 20:14:07.507622 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.507617 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 20:14:07.507762 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.507655 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 20:14:07.508032
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.508012 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 20:14:07.508333 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.508316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5g2qj\""
Apr 16 20:14:07.508426 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.508322 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 20:14:07.508426 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.508323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 20:14:07.513348 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.513312 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 20:14:07.520908 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.520889 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:14:07.541282 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-config-out\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName:
\"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541312 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-web-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541499 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541499 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541499 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541464 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541499
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqb5m\" (UniqueName: \"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-kube-api-access-wqb5m\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541683 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541683 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-config-volume\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541683 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541683 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName:
\"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541683 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.541852 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.541694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.642799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.642799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642831 2569
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-config-out\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-web-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.642999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16
20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqb5m\" (UniqueName: \"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-kube-api-access-wqb5m\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-config-volume\") pod
\"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.643578 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.644440 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.643968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.644569 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.644448 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d15b1732-83c2-484a-aeb4-cbf9de572514-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:14:07.646089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.645996 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName:
\"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-web-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646281 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646149 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646281 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d15b1732-83c2-484a-aeb4-cbf9de572514-config-out\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646389 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646471 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.646788 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.646767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.647506 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.647488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.647608 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.647589 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d15b1732-83c2-484a-aeb4-cbf9de572514-config-volume\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.654880 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.654844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqb5m\" (UniqueName: \"kubernetes.io/projected/d15b1732-83c2-484a-aeb4-cbf9de572514-kube-api-access-wqb5m\") pod \"alertmanager-main-0\" (UID: \"d15b1732-83c2-484a-aeb4-cbf9de572514\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.814014 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:14:07.813981 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:14:07.945430 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:07.945396 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:14:07.948419 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:14:07.948386 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15b1732_83c2_484a_aeb4_cbf9de572514.slice/crio-dedac354c2dbecb24f1b796755f43268b78d9ba2776eef62791141817014db27 WatchSource:0}: Error finding container dedac354c2dbecb24f1b796755f43268b78d9ba2776eef62791141817014db27: Status 404 returned error can't find the container with id dedac354c2dbecb24f1b796755f43268b78d9ba2776eef62791141817014db27 Apr 16 20:14:08.446560 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:08.446524 2569 generic.go:358] "Generic (PLEG): container finished" podID="d15b1732-83c2-484a-aeb4-cbf9de572514" containerID="1cdf82c3e713062a5df38c6948207cf3aeb36b4c02787580c0be193ee436db0b" exitCode=0 Apr 16 20:14:08.446976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:08.446607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerDied","Data":"1cdf82c3e713062a5df38c6948207cf3aeb36b4c02787580c0be193ee436db0b"} Apr 16 20:14:08.446976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:08.446653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"dedac354c2dbecb24f1b796755f43268b78d9ba2776eef62791141817014db27"} Apr 16 20:14:08.950068 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:08.950031 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="de260e29-0694-4f3f-ba1e-5010b43cd219" path="/var/lib/kubelet/pods/de260e29-0694-4f3f-ba1e-5010b43cd219/volumes" Apr 16 20:14:09.453479 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"67348398b49415389c10cbc08d6659635779a2aea2e68fa41aeb0a95ba15778c"} Apr 16 20:14:09.453864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"75c83eb03c9451ad417cc3c763478e56766a60834fd1047550a3ce180503a91d"} Apr 16 20:14:09.453864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453499 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"478d7b6a1385bbd1465ca766f7d6ce5d33faf7ea6641256c36ed5693c0b06c7a"} Apr 16 20:14:09.453864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453513 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"be1a67265f2222aaad258f3e1fc1729e2c380594108449f81568228e6e32a5f1"} Apr 16 20:14:09.453864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"b1cf2b23381ecff4ae1ed5a985a4635a954f419c4c8e488743d98f0ef9e43280"} Apr 16 20:14:09.453864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.453536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d15b1732-83c2-484a-aeb4-cbf9de572514","Type":"ContainerStarted","Data":"77f3a05c73e132a984cd4ea193b5238e7c8a1768c500b40411afa08ce6491f10"} Apr 16 20:14:09.490335 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.490281 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.490266862 podStartE2EDuration="2.490266862s" podCreationTimestamp="2026-04-16 20:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:09.489983894 +0000 UTC m=+137.151581383" watchObservedRunningTime="2026-04-16 20:14:09.490266862 +0000 UTC m=+137.151864351" Apr 16 20:14:09.869918 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.869885 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-659db59bd6-6x6bt"] Apr 16 20:14:09.872264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.872244 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.877369 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.877349 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 20:14:09.877516 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.877350 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 20:14:09.877671 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.877646 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-w7fxw\"" Apr 16 20:14:09.877828 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.877809 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 20:14:09.877907 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.877864 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 20:14:09.878106 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.878091 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 20:14:09.889814 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.889786 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-659db59bd6-6x6bt"] Apr 16 20:14:09.895178 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.895105 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 20:14:09.961209 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kwp72\" (UniqueName: \"kubernetes.io/projected/eaa06018-7d0b-4e97-8eab-d295413238eb-kube-api-access-kwp72\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961376 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961236 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961376 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-federate-client-tls\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961376 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961353 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961376 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-client-tls\") pod 
\"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961507 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-metrics-client-ca\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961507 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-serving-certs-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:09.961507 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:09.961443 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062555 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-federate-client-tls\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " 
pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062555 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-client-tls\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-metrics-client-ca\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-serving-certs-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.062781 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwp72\" (UniqueName: \"kubernetes.io/projected/eaa06018-7d0b-4e97-8eab-d295413238eb-kube-api-access-kwp72\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.063012 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.062819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.063730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.063700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-serving-certs-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.063844 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.063706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-metrics-client-ca\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: 
\"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.063844 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.063759 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.065161 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.065138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-telemeter-client-tls\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.065647 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.065614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-federate-client-tls\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.065733 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.065623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.065733 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.065690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eaa06018-7d0b-4e97-8eab-d295413238eb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.075650 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.075626 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwp72\" (UniqueName: \"kubernetes.io/projected/eaa06018-7d0b-4e97-8eab-d295413238eb-kube-api-access-kwp72\") pod \"telemeter-client-659db59bd6-6x6bt\" (UID: \"eaa06018-7d0b-4e97-8eab-d295413238eb\") " pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.181746 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.181656 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" Apr 16 20:14:10.314270 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.314236 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-659db59bd6-6x6bt"] Apr 16 20:14:10.317104 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:14:10.317074 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaa06018_7d0b_4e97_8eab_d295413238eb.slice/crio-413b62137924cda951a830a2d1e51c1cf5a6f632b39435d368b8108e1b64c94c WatchSource:0}: Error finding container 413b62137924cda951a830a2d1e51c1cf5a6f632b39435d368b8108e1b64c94c: Status 404 returned error can't find the container with id 413b62137924cda951a830a2d1e51c1cf5a6f632b39435d368b8108e1b64c94c Apr 16 20:14:10.458329 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:10.458241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" 
event={"ID":"eaa06018-7d0b-4e97-8eab-d295413238eb","Type":"ContainerStarted","Data":"413b62137924cda951a830a2d1e51c1cf5a6f632b39435d368b8108e1b64c94c"} Apr 16 20:14:13.468705 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:13.468661 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" event={"ID":"eaa06018-7d0b-4e97-8eab-d295413238eb","Type":"ContainerStarted","Data":"f569e791353a2fad473ca3c6c33171026ec8659f2e9f02417451e889514dd2be"} Apr 16 20:14:13.468705 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:13.468707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" event={"ID":"eaa06018-7d0b-4e97-8eab-d295413238eb","Type":"ContainerStarted","Data":"674cb01fd3e667533b3d3368421b5c95826271afdb2956255d352bd44b37557a"} Apr 16 20:14:13.469243 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:13.468716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" event={"ID":"eaa06018-7d0b-4e97-8eab-d295413238eb","Type":"ContainerStarted","Data":"1f22efbccc6aae42aef0a2d688baaea5f3b98f200ee98173bad25df8ad35a7aa"} Apr 16 20:14:13.492834 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:13.492768 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-659db59bd6-6x6bt" podStartSLOduration=2.259560192 podStartE2EDuration="4.492751445s" podCreationTimestamp="2026-04-16 20:14:09 +0000 UTC" firstStartedPulling="2026-04-16 20:14:10.318874442 +0000 UTC m=+137.980471913" lastFinishedPulling="2026-04-16 20:14:12.552065691 +0000 UTC m=+140.213663166" observedRunningTime="2026-04-16 20:14:13.490518517 +0000 UTC m=+141.152116005" watchObservedRunningTime="2026-04-16 20:14:13.492751445 +0000 UTC m=+141.154348935" Apr 16 20:14:14.246077 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.246040 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:14:14.249279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.249261 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.253789 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.253765 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qtq7x\"" Apr 16 20:14:14.254715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.254687 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:14:14.254835 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.254735 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:14:14.254965 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.254938 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:14:14.254965 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.254954 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:14:14.255129 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.254985 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:14:14.255129 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.255008 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:14:14.255880 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.255862 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:14:14.260678 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:14:14.260655 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 20:14:14.264069 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.264049 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:14:14.296172 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296172 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296369 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp225\" (UniqueName: \"kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296369 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296274 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config\") pod \"console-747d5547f9-bpgjb\" (UID: 
\"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296369 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296321 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296456 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296386 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.296456 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.296424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397144 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397107 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397342 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lp225\" (UniqueName: \"kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397342 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397342 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397342 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397342 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.397560 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397363 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.398008 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.397984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.398101 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.398053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.398101 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.398053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.398298 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.398276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.399571 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:14:14.399550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.399750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.399732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.407152 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.407131 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp225\" (UniqueName: \"kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225\") pod \"console-747d5547f9-bpgjb\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.559093 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.558998 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:14.674253 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:14.674232 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:14:14.676574 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:14:14.676546 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0519aee2_e581_4c17_97b2_c083989c38e5.slice/crio-2e5710fbb53500ac83edec8414627d74a90b3e00e43267d62beb31a428bb4d90 WatchSource:0}: Error finding container 2e5710fbb53500ac83edec8414627d74a90b3e00e43267d62beb31a428bb4d90: Status 404 returned error can't find the container with id 2e5710fbb53500ac83edec8414627d74a90b3e00e43267d62beb31a428bb4d90 Apr 16 20:14:15.475482 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:15.475442 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747d5547f9-bpgjb" event={"ID":"0519aee2-e581-4c17-97b2-c083989c38e5","Type":"ContainerStarted","Data":"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040"} Apr 16 20:14:15.475482 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:15.475483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747d5547f9-bpgjb" event={"ID":"0519aee2-e581-4c17-97b2-c083989c38e5","Type":"ContainerStarted","Data":"2e5710fbb53500ac83edec8414627d74a90b3e00e43267d62beb31a428bb4d90"} Apr 16 20:14:15.493791 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:15.493746 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-747d5547f9-bpgjb" podStartSLOduration=1.493732549 podStartE2EDuration="1.493732549s" podCreationTimestamp="2026-04-16 20:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:15.492848759 +0000 UTC 
m=+143.154446247" watchObservedRunningTime="2026-04-16 20:14:15.493732549 +0000 UTC m=+143.155330039" Apr 16 20:14:24.559818 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:24.559783 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:24.560303 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:24.559865 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:24.564573 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:24.564553 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:14:25.505193 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:14:25.505165 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:15:27.692857 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.692817 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:15:27.696052 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.696031 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.713445 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.713423 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:15:27.763078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.763078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.763319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763118 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdpj\" (UniqueName: \"kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.763319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" 
Apr 16 20:15:27.763319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.763319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.763319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.763250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864179 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864192 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca\") pod 
\"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864539 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdpj\" (UniqueName: \"kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864539 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864497 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864640 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.864909 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.865049 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.864960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.865197 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.865175 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.865308 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.865290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.867150 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.867132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.867400 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.867378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:27.882195 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:27.882163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdpj\" (UniqueName: \"kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj\") pod \"console-6dbb4d86bc-vm9w8\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:28.005024 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:28.004986 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:28.125654 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:28.125620 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:15:28.128080 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:15:28.128053 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ad24f9_fa9c_40be_921a_979c72985ebc.slice/crio-bb7a295935ab89d46892b0ebfc49b4acb8e7d50c37d95a5bb67ec281da9dfece WatchSource:0}: Error finding container bb7a295935ab89d46892b0ebfc49b4acb8e7d50c37d95a5bb67ec281da9dfece: Status 404 returned error can't find the container with id bb7a295935ab89d46892b0ebfc49b4acb8e7d50c37d95a5bb67ec281da9dfece Apr 16 20:15:28.667890 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:28.667855 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbb4d86bc-vm9w8" event={"ID":"f6ad24f9-fa9c-40be-921a-979c72985ebc","Type":"ContainerStarted","Data":"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0"} Apr 16 20:15:28.667890 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:28.667890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbb4d86bc-vm9w8" event={"ID":"f6ad24f9-fa9c-40be-921a-979c72985ebc","Type":"ContainerStarted","Data":"bb7a295935ab89d46892b0ebfc49b4acb8e7d50c37d95a5bb67ec281da9dfece"} Apr 16 20:15:28.695721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:28.695670 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dbb4d86bc-vm9w8" podStartSLOduration=1.695655793 podStartE2EDuration="1.695655793s" podCreationTimestamp="2026-04-16 20:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:15:28.693893109 +0000 UTC 
m=+216.355490610" watchObservedRunningTime="2026-04-16 20:15:28.695655793 +0000 UTC m=+216.357253281" Apr 16 20:15:38.005910 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:38.005875 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:38.006402 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:38.005921 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:38.011428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:38.011402 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:38.699465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:38.699437 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:15:38.754097 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:38.754064 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:15:48.326874 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.326836 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qhfz4"] Apr 16 20:15:48.330089 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.330071 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.335453 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.335435 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:15:48.339418 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.339392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhfz4"] Apr 16 20:15:48.442241 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.442173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64a16ceb-71a8-4203-b254-e617d6a240e4-original-pull-secret\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.442419 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.442258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-kubelet-config\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.442419 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.442282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-dbus\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.543706 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.543672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/64a16ceb-71a8-4203-b254-e617d6a240e4-original-pull-secret\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.543838 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.543725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-kubelet-config\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.543838 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.543751 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-dbus\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.543947 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.543877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-kubelet-config\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.543947 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.543890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64a16ceb-71a8-4203-b254-e617d6a240e4-dbus\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.545988 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.545970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64a16ceb-71a8-4203-b254-e617d6a240e4-original-pull-secret\") pod \"global-pull-secret-syncer-qhfz4\" (UID: \"64a16ceb-71a8-4203-b254-e617d6a240e4\") " pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.640346 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.640264 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qhfz4" Apr 16 20:15:48.760340 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:48.760316 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qhfz4"] Apr 16 20:15:48.762709 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:15:48.762681 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a16ceb_71a8_4203_b254_e617d6a240e4.slice/crio-58321add84a3cc45f95dc6610be4da6298bcb8f57332b77ea319fb07bfde0163 WatchSource:0}: Error finding container 58321add84a3cc45f95dc6610be4da6298bcb8f57332b77ea319fb07bfde0163: Status 404 returned error can't find the container with id 58321add84a3cc45f95dc6610be4da6298bcb8f57332b77ea319fb07bfde0163 Apr 16 20:15:49.728338 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:49.728290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhfz4" event={"ID":"64a16ceb-71a8-4203-b254-e617d6a240e4","Type":"ContainerStarted","Data":"58321add84a3cc45f95dc6610be4da6298bcb8f57332b77ea319fb07bfde0163"} Apr 16 20:15:52.738695 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:52.738656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qhfz4" event={"ID":"64a16ceb-71a8-4203-b254-e617d6a240e4","Type":"ContainerStarted","Data":"f0a3ff0ff70cb3586d90aef75a3ce47e214fde960943e8bf4196a712f9ee68ee"} Apr 16 20:15:52.766004 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:15:52.765948 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qhfz4" podStartSLOduration=1.074135787 podStartE2EDuration="4.765933155s" podCreationTimestamp="2026-04-16 20:15:48 +0000 UTC" firstStartedPulling="2026-04-16 20:15:48.764334212 +0000 UTC m=+236.425931693" lastFinishedPulling="2026-04-16 20:15:52.456131591 +0000 UTC m=+240.117729061" observedRunningTime="2026-04-16 20:15:52.764441744 +0000 UTC m=+240.426039232" watchObservedRunningTime="2026-04-16 20:15:52.765933155 +0000 UTC m=+240.427530622" Apr 16 20:16:03.774735 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:03.774626 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-747d5547f9-bpgjb" podUID="0519aee2-e581-4c17-97b2-c083989c38e5" containerName="console" containerID="cri-o://268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040" gracePeriod=15 Apr 16 20:16:04.011776 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.011753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-747d5547f9-bpgjb_0519aee2-e581-4c17-97b2-c083989c38e5/console/0.log" Apr 16 20:16:04.011894 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.011814 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:16:04.069719 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.069631 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.069719 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.069691 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.069942 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.069726 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.069942 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.069771 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.069942 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.069795 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.069942 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:16:04.069831 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp225\" (UniqueName: \"kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.070139 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.070001 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert\") pod \"0519aee2-e581-4c17-97b2-c083989c38e5\" (UID: \"0519aee2-e581-4c17-97b2-c083989c38e5\") " Apr 16 20:16:04.070259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.070200 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca" (OuterVolumeSpecName: "service-ca") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:04.070259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.070210 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:04.070259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.070242 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config" (OuterVolumeSpecName: "console-config") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:04.070445 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.070278 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:04.071919 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.071894 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:04.072049 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.072025 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225" (OuterVolumeSpecName: "kube-api-access-lp225") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "kube-api-access-lp225". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:04.072157 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.072141 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0519aee2-e581-4c17-97b2-c083989c38e5" (UID: "0519aee2-e581-4c17-97b2-c083989c38e5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:04.171356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171323 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-console-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171350 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-oauth-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171362 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-service-ca\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171372 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lp225\" (UniqueName: \"kubernetes.io/projected/0519aee2-e581-4c17-97b2-c083989c38e5-kube-api-access-lp225\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171382 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171391 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0519aee2-e581-4c17-97b2-c083989c38e5-console-oauth-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.171584 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.171400 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0519aee2-e581-4c17-97b2-c083989c38e5-trusted-ca-bundle\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:04.772071 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772042 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-747d5547f9-bpgjb_0519aee2-e581-4c17-97b2-c083989c38e5/console/0.log" Apr 16 20:16:04.772259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772082 2569 generic.go:358] "Generic (PLEG): container finished" podID="0519aee2-e581-4c17-97b2-c083989c38e5" containerID="268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040" exitCode=2 Apr 16 20:16:04.772259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772181 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-747d5547f9-bpgjb" Apr 16 20:16:04.772259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747d5547f9-bpgjb" event={"ID":"0519aee2-e581-4c17-97b2-c083989c38e5","Type":"ContainerDied","Data":"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040"} Apr 16 20:16:04.772259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772241 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-747d5547f9-bpgjb" event={"ID":"0519aee2-e581-4c17-97b2-c083989c38e5","Type":"ContainerDied","Data":"2e5710fbb53500ac83edec8414627d74a90b3e00e43267d62beb31a428bb4d90"} Apr 16 20:16:04.772259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.772260 2569 scope.go:117] "RemoveContainer" containerID="268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040" Apr 16 20:16:04.780444 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.780263 2569 scope.go:117] "RemoveContainer" containerID="268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040" Apr 16 20:16:04.780739 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:04.780602 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040\": container with ID starting with 268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040 not found: ID does not exist" containerID="268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040" Apr 16 20:16:04.780739 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.780627 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040"} err="failed to get container status \"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040\": rpc error: code = 
NotFound desc = could not find container \"268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040\": container with ID starting with 268845534b8022b7e1051a639c570fe389b9a390a4e0aac744e6406f671ed040 not found: ID does not exist" Apr 16 20:16:04.795264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.793088 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:16:04.802935 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.802913 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-747d5547f9-bpgjb"] Apr 16 20:16:04.949728 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:04.949691 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0519aee2-e581-4c17-97b2-c083989c38e5" path="/var/lib/kubelet/pods/0519aee2-e581-4c17-97b2-c083989c38e5/volumes" Apr 16 20:16:06.949323 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.949295 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws"] Apr 16 20:16:06.949769 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.949647 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0519aee2-e581-4c17-97b2-c083989c38e5" containerName="console" Apr 16 20:16:06.949769 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.949664 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0519aee2-e581-4c17-97b2-c083989c38e5" containerName="console" Apr 16 20:16:06.949769 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.949748 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0519aee2-e581-4c17-97b2-c083989c38e5" containerName="console" Apr 16 20:16:06.954599 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.954578 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:06.957133 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.957113 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:16:06.957265 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.957140 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:16:06.957265 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.957119 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4lrfh\"" Apr 16 20:16:06.960131 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.960106 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws"] Apr 16 20:16:06.990854 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.990821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5jfc\" (UniqueName: \"kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:06.990991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.990885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:06.991093 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:06.991071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.092348 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.092313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5jfc\" (UniqueName: \"kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.092505 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.092358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.092505 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.092417 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.092811 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.092789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.093249 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.093234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.100855 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.100823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5jfc\" (UniqueName: \"kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.264355 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.264325 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:07.387729 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.387700 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws"] Apr 16 20:16:07.390750 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:16:07.390722 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9089f6_3309_41ae_8d24_2ccaa983f8b9.slice/crio-895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7 WatchSource:0}: Error finding container 895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7: Status 404 returned error can't find the container with id 895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7 Apr 16 20:16:07.782799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:07.782765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" event={"ID":"aa9089f6-3309-41ae-8d24-2ccaa983f8b9","Type":"ContainerStarted","Data":"895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7"} Apr 16 20:16:13.802975 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:13.802881 2569 generic.go:358] "Generic (PLEG): container finished" podID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerID="77525eff4b34bf9d030f72908ed8cbc6d070d153f65ba0320453fe319038d411" exitCode=0 Apr 16 20:16:13.803451 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:13.802979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" event={"ID":"aa9089f6-3309-41ae-8d24-2ccaa983f8b9","Type":"ContainerDied","Data":"77525eff4b34bf9d030f72908ed8cbc6d070d153f65ba0320453fe319038d411"} Apr 16 20:16:16.811682 ip-10-0-135-182 kubenswrapper[2569]: 
I0416 20:16:16.811644 2569 generic.go:358] "Generic (PLEG): container finished" podID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerID="c0dd961b591e1b99004e4445726aa841eeceffb7099447a9a8db53c1054b4092" exitCode=0 Apr 16 20:16:16.812144 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:16.811734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" event={"ID":"aa9089f6-3309-41ae-8d24-2ccaa983f8b9","Type":"ContainerDied","Data":"c0dd961b591e1b99004e4445726aa841eeceffb7099447a9a8db53c1054b4092"} Apr 16 20:16:24.836799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:24.836759 2569 generic.go:358] "Generic (PLEG): container finished" podID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerID="c003d823093fb22d74245b95c6942fbca9726b2ebffd5fa0520fb672487a81b4" exitCode=0 Apr 16 20:16:24.837264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:24.836798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" event={"ID":"aa9089f6-3309-41ae-8d24-2ccaa983f8b9","Type":"ContainerDied","Data":"c003d823093fb22d74245b95c6942fbca9726b2ebffd5fa0520fb672487a81b4"} Apr 16 20:16:25.955625 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:25.955599 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:26.061484 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.061437 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5jfc\" (UniqueName: \"kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc\") pod \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " Apr 16 20:16:26.061684 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.061507 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util\") pod \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " Apr 16 20:16:26.061684 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.061563 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle\") pod \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\" (UID: \"aa9089f6-3309-41ae-8d24-2ccaa983f8b9\") " Apr 16 20:16:26.062288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.062261 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle" (OuterVolumeSpecName: "bundle") pod "aa9089f6-3309-41ae-8d24-2ccaa983f8b9" (UID: "aa9089f6-3309-41ae-8d24-2ccaa983f8b9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:26.063746 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.063718 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc" (OuterVolumeSpecName: "kube-api-access-c5jfc") pod "aa9089f6-3309-41ae-8d24-2ccaa983f8b9" (UID: "aa9089f6-3309-41ae-8d24-2ccaa983f8b9"). InnerVolumeSpecName "kube-api-access-c5jfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:26.065715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.065696 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util" (OuterVolumeSpecName: "util") pod "aa9089f6-3309-41ae-8d24-2ccaa983f8b9" (UID: "aa9089f6-3309-41ae-8d24-2ccaa983f8b9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:16:26.163053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.162971 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-bundle\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:26.163053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.162997 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5jfc\" (UniqueName: \"kubernetes.io/projected/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-kube-api-access-c5jfc\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:26.163053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.163009 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa9089f6-3309-41ae-8d24-2ccaa983f8b9-util\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:16:26.843845 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.843819 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" Apr 16 20:16:26.844006 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.843818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2c9ws" event={"ID":"aa9089f6-3309-41ae-8d24-2ccaa983f8b9","Type":"ContainerDied","Data":"895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7"} Apr 16 20:16:26.844006 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:26.843923 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="895013cd6e30d4f1c52db646c341eb65f4fa7c0cee279566d324dc4cd9b7a4c7" Apr 16 20:16:28.815730 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.815696 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj"] Apr 16 20:16:28.816087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816071 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="util" Apr 16 20:16:28.816087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816086 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="util" Apr 16 20:16:28.816170 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816110 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="extract" Apr 16 20:16:28.816170 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816120 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="extract" Apr 16 20:16:28.816170 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816145 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="pull" Apr 16 20:16:28.816170 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816153 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="pull" Apr 16 20:16:28.816323 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.816231 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa9089f6-3309-41ae-8d24-2ccaa983f8b9" containerName="extract" Apr 16 20:16:28.864451 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.864414 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj"] Apr 16 20:16:28.864606 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.864528 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:28.866724 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.866703 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-xts98\"" Apr 16 20:16:28.866924 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.866909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 20:16:28.866974 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.866952 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 20:16:28.866974 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.866970 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 20:16:28.983302 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.983266 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vkf\" (UniqueName: 
\"kubernetes.io/projected/b326663d-4aa9-4f0e-9149-4c87414dce22-kube-api-access-77vkf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: \"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:28.983463 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:28.983346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b326663d-4aa9-4f0e-9149-4c87414dce22-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: \"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.083773 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.083690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b326663d-4aa9-4f0e-9149-4c87414dce22-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: \"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.083926 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.083867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77vkf\" (UniqueName: \"kubernetes.io/projected/b326663d-4aa9-4f0e-9149-4c87414dce22-kube-api-access-77vkf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: \"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.086129 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.086103 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b326663d-4aa9-4f0e-9149-4c87414dce22-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: 
\"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.093805 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.093779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vkf\" (UniqueName: \"kubernetes.io/projected/b326663d-4aa9-4f0e-9149-4c87414dce22-kube-api-access-77vkf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj\" (UID: \"b326663d-4aa9-4f0e-9149-4c87414dce22\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.174678 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.174645 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:29.308731 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.308698 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj"] Apr 16 20:16:29.312811 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:16:29.312774 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb326663d_4aa9_4f0e_9149_4c87414dce22.slice/crio-7b106299d944036747b0d77f9728c6e07d33c176a789e1093c4bf0c152c2582b WatchSource:0}: Error finding container 7b106299d944036747b0d77f9728c6e07d33c176a789e1093c4bf0c152c2582b: Status 404 returned error can't find the container with id 7b106299d944036747b0d77f9728c6e07d33c176a789e1093c4bf0c152c2582b Apr 16 20:16:29.852712 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:29.852678 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" event={"ID":"b326663d-4aa9-4f0e-9149-4c87414dce22","Type":"ContainerStarted","Data":"7b106299d944036747b0d77f9728c6e07d33c176a789e1093c4bf0c152c2582b"} Apr 16 20:16:32.865091 ip-10-0-135-182 kubenswrapper[2569]: 
I0416 20:16:32.865008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" event={"ID":"b326663d-4aa9-4f0e-9149-4c87414dce22","Type":"ContainerStarted","Data":"9da0dc10ce564ca780b3716ccdc01dc313b0359a7a1e30b4d96d47ccd972e8af"} Apr 16 20:16:32.865460 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:32.865132 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:32.885482 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:32.885435 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" podStartSLOduration=1.7106282739999998 podStartE2EDuration="4.885421431s" podCreationTimestamp="2026-04-16 20:16:28 +0000 UTC" firstStartedPulling="2026-04-16 20:16:29.314651791 +0000 UTC m=+276.976249258" lastFinishedPulling="2026-04-16 20:16:32.489444944 +0000 UTC m=+280.151042415" observedRunningTime="2026-04-16 20:16:32.884709688 +0000 UTC m=+280.546307179" watchObservedRunningTime="2026-04-16 20:16:32.885421431 +0000 UTC m=+280.547018919" Apr 16 20:16:33.045067 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.045027 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s9g96"] Apr 16 20:16:33.048280 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.048259 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.050859 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.050834 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 20:16:33.050987 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.050866 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 20:16:33.050987 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.050886 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-crtrl\"" Apr 16 20:16:33.056707 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.056686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s9g96"] Apr 16 20:16:33.224875 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.224796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7thx\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-kube-api-access-q7thx\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.224875 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.224829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.225082 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.224976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/ed17ce31-2aab-456d-992c-884e0d92416e-cabundle0\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.248405 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.248371 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq"] Apr 16 20:16:33.251497 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.251479 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.253616 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.253596 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 20:16:33.261154 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.261126 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq"] Apr 16 20:16:33.326001 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.325966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7thx\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-kube-api-access-q7thx\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.326173 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.326007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.326173 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.326091 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ed17ce31-2aab-456d-992c-884e0d92416e-cabundle0\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.326288 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.326179 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 16 20:16:33.326288 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.326202 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:16:33.326288 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.326233 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:16:33.326288 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.326249 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s9g96: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 20:16:33.326431 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.326302 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates podName:ed17ce31-2aab-456d-992c-884e0d92416e nodeName:}" failed. No retries permitted until 2026-04-16 20:16:33.826285711 +0000 UTC m=+281.487883178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates") pod "keda-operator-ffbb595cb-s9g96" (UID: "ed17ce31-2aab-456d-992c-884e0d92416e") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 20:16:33.326702 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.326685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ed17ce31-2aab-456d-992c-884e0d92416e-cabundle0\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.336051 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.336027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7thx\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-kube-api-access-q7thx\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.426461 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.426427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9a0b8b8-3160-4b21-a348-9fef7d101905-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.426667 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.426470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.426667 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.426555 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789nq\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-kube-api-access-789nq\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.527390 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.527359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9a0b8b8-3160-4b21-a348-9fef7d101905-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.527566 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.527404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.527566 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.527468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-789nq\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-kube-api-access-789nq\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.527691 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.527573 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 
20:16:33.527691 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.527592 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:16:33.527691 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.527615 2569 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 20:16:33.527691 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.527635 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 20:16:33.527869 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.527712 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates podName:c9a0b8b8-3160-4b21-a348-9fef7d101905 nodeName:}" failed. No retries permitted until 2026-04-16 20:16:34.027690167 +0000 UTC m=+281.689287634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates") pod "keda-metrics-apiserver-7c9f485588-59ztq" (UID: "c9a0b8b8-3160-4b21-a348-9fef7d101905") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 20:16:33.527869 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.527719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c9a0b8b8-3160-4b21-a348-9fef7d101905-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.542361 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.542333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-789nq\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-kube-api-access-789nq\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:33.569741 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.569712 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nn2zl"] Apr 16 20:16:33.573260 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.573239 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.575872 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.575855 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 20:16:33.587230 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.587197 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nn2zl"] Apr 16 20:16:33.729176 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.729127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-certificates\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.729356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.729281 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56kw\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-kube-api-access-b56kw\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.830066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.829983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-certificates\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.830066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.830026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:33.830352 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.830086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b56kw\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-kube-api-access-b56kw\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.830352 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.830239 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:16:33.830352 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.830264 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:16:33.830352 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.830277 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s9g96: references non-existent secret key: ca.crt Apr 16 20:16:33.830352 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:33.830342 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates podName:ed17ce31-2aab-456d-992c-884e0d92416e nodeName:}" failed. No retries permitted until 2026-04-16 20:16:34.830319418 +0000 UTC m=+282.491916887 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates") pod "keda-operator-ffbb595cb-s9g96" (UID: "ed17ce31-2aab-456d-992c-884e0d92416e") : references non-existent secret key: ca.crt Apr 16 20:16:33.832486 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.832466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-certificates\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.842078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.842054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56kw\" (UniqueName: \"kubernetes.io/projected/3327532f-0ac5-4e45-9344-4cef9f84da7c-kube-api-access-b56kw\") pod \"keda-admission-cf49989db-nn2zl\" (UID: \"3327532f-0ac5-4e45-9344-4cef9f84da7c\") " pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:33.885825 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:33.885792 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:34.008033 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:34.008011 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nn2zl"] Apr 16 20:16:34.009791 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:16:34.009755 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3327532f_0ac5_4e45_9344_4cef9f84da7c.slice/crio-74c2a6e484431b23b3116cad2124be4e4cf7070d368e0f0e8d2f2a64dee7983e WatchSource:0}: Error finding container 74c2a6e484431b23b3116cad2124be4e4cf7070d368e0f0e8d2f2a64dee7983e: Status 404 returned error can't find the container with id 74c2a6e484431b23b3116cad2124be4e4cf7070d368e0f0e8d2f2a64dee7983e Apr 16 20:16:34.032455 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:34.032429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:34.032591 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.032576 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:16:34.032632 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.032597 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:16:34.032632 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.032619 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq: references non-existent secret key: tls.crt Apr 16 20:16:34.032688 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.032675 2569 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates podName:c9a0b8b8-3160-4b21-a348-9fef7d101905 nodeName:}" failed. No retries permitted until 2026-04-16 20:16:35.032657246 +0000 UTC m=+282.694254725 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates") pod "keda-metrics-apiserver-7c9f485588-59ztq" (UID: "c9a0b8b8-3160-4b21-a348-9fef7d101905") : references non-existent secret key: tls.crt Apr 16 20:16:34.839699 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:34.839666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:34.839883 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.839832 2569 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:16:34.839883 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.839851 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:16:34.839883 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.839864 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-s9g96: references non-existent secret key: ca.crt Apr 16 20:16:34.840055 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:34.839924 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates podName:ed17ce31-2aab-456d-992c-884e0d92416e nodeName:}" failed. No retries permitted until 2026-04-16 20:16:36.839906863 +0000 UTC m=+284.501504330 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates") pod "keda-operator-ffbb595cb-s9g96" (UID: "ed17ce31-2aab-456d-992c-884e0d92416e") : references non-existent secret key: ca.crt Apr 16 20:16:34.873692 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:34.873654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nn2zl" event={"ID":"3327532f-0ac5-4e45-9344-4cef9f84da7c","Type":"ContainerStarted","Data":"74c2a6e484431b23b3116cad2124be4e4cf7070d368e0f0e8d2f2a64dee7983e"} Apr 16 20:16:35.040897 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:35.040857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:35.041345 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:35.041007 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:16:35.041345 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:35.041027 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:16:35.041345 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:35.041045 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq: references non-existent secret key: tls.crt Apr 16 20:16:35.041345 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:16:35.041107 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates podName:c9a0b8b8-3160-4b21-a348-9fef7d101905 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:16:37.04108908 +0000 UTC m=+284.702686552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates") pod "keda-metrics-apiserver-7c9f485588-59ztq" (UID: "c9a0b8b8-3160-4b21-a348-9fef7d101905") : references non-existent secret key: tls.crt Apr 16 20:16:35.877834 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:35.877741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nn2zl" event={"ID":"3327532f-0ac5-4e45-9344-4cef9f84da7c","Type":"ContainerStarted","Data":"e02f204b456abdf4b9ac723ce4924708e31c401873eabccfe5dd11df29ce69ee"} Apr 16 20:16:35.877976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:35.877859 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:16:35.895291 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:35.895234 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nn2zl" podStartSLOduration=1.402881862 podStartE2EDuration="2.895193314s" podCreationTimestamp="2026-04-16 20:16:33 +0000 UTC" firstStartedPulling="2026-04-16 20:16:34.011022163 +0000 UTC m=+281.672619631" lastFinishedPulling="2026-04-16 20:16:35.503333612 +0000 UTC m=+283.164931083" observedRunningTime="2026-04-16 20:16:35.893927856 +0000 UTC m=+283.555525344" watchObservedRunningTime="2026-04-16 20:16:35.895193314 +0000 UTC m=+283.556790810" Apr 16 20:16:36.856499 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:36.856466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " 
pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:36.858852 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:36.858829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ed17ce31-2aab-456d-992c-884e0d92416e-certificates\") pod \"keda-operator-ffbb595cb-s9g96\" (UID: \"ed17ce31-2aab-456d-992c-884e0d92416e\") " pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:36.959258 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:36.959211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:37.059179 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.059135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:37.061987 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.061965 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c9a0b8b8-3160-4b21-a348-9fef7d101905-certificates\") pod \"keda-metrics-apiserver-7c9f485588-59ztq\" (UID: \"c9a0b8b8-3160-4b21-a348-9fef7d101905\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:37.107245 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.107196 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-s9g96"] Apr 16 20:16:37.108791 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:16:37.108760 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded17ce31_2aab_456d_992c_884e0d92416e.slice/crio-aeacb7173b23ccee3efe4347e41b6471b8557f16a9b0ac3506d541fc5829b6ad WatchSource:0}: Error finding container aeacb7173b23ccee3efe4347e41b6471b8557f16a9b0ac3506d541fc5829b6ad: Status 404 returned error can't find the container with id aeacb7173b23ccee3efe4347e41b6471b8557f16a9b0ac3506d541fc5829b6ad Apr 16 20:16:37.162097 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.162065 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:37.278413 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.278356 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq"] Apr 16 20:16:37.280857 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:16:37.280827 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a0b8b8_3160_4b21_a348_9fef7d101905.slice/crio-7d44616b1d292b05b1b98dc70b40ac958f8d5fb867f0169668268c30a03d64ea WatchSource:0}: Error finding container 7d44616b1d292b05b1b98dc70b40ac958f8d5fb867f0169668268c30a03d64ea: Status 404 returned error can't find the container with id 7d44616b1d292b05b1b98dc70b40ac958f8d5fb867f0169668268c30a03d64ea Apr 16 20:16:37.885641 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.885601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" event={"ID":"c9a0b8b8-3160-4b21-a348-9fef7d101905","Type":"ContainerStarted","Data":"7d44616b1d292b05b1b98dc70b40ac958f8d5fb867f0169668268c30a03d64ea"} Apr 16 20:16:37.886850 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:37.886822 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" 
event={"ID":"ed17ce31-2aab-456d-992c-884e0d92416e","Type":"ContainerStarted","Data":"aeacb7173b23ccee3efe4347e41b6471b8557f16a9b0ac3506d541fc5829b6ad"} Apr 16 20:16:41.903109 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:41.903071 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" event={"ID":"ed17ce31-2aab-456d-992c-884e0d92416e","Type":"ContainerStarted","Data":"f5f424566e203469b442b0023fb974ed2c4ccedb258c15b837436b40b4974ae3"} Apr 16 20:16:41.903597 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:41.903134 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:16:41.904312 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:41.904289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" event={"ID":"c9a0b8b8-3160-4b21-a348-9fef7d101905","Type":"ContainerStarted","Data":"1408a47dd632d70151f8c0ddfca18e4e3f80db5b06c658522dc2a897935dc700"} Apr 16 20:16:41.904447 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:41.904436 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:41.919429 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:41.919381 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" podStartSLOduration=5.002646858 podStartE2EDuration="8.919367837s" podCreationTimestamp="2026-04-16 20:16:33 +0000 UTC" firstStartedPulling="2026-04-16 20:16:37.110287059 +0000 UTC m=+284.771884533" lastFinishedPulling="2026-04-16 20:16:41.027008045 +0000 UTC m=+288.688605512" observedRunningTime="2026-04-16 20:16:41.919280873 +0000 UTC m=+289.580878363" watchObservedRunningTime="2026-04-16 20:16:41.919367837 +0000 UTC m=+289.580965325" Apr 16 20:16:41.939847 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:16:41.939801 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" podStartSLOduration=5.200393076 podStartE2EDuration="8.939785437s" podCreationTimestamp="2026-04-16 20:16:33 +0000 UTC" firstStartedPulling="2026-04-16 20:16:37.282049371 +0000 UTC m=+284.943646838" lastFinishedPulling="2026-04-16 20:16:41.021441732 +0000 UTC m=+288.683039199" observedRunningTime="2026-04-16 20:16:41.93861604 +0000 UTC m=+289.600213528" watchObservedRunningTime="2026-04-16 20:16:41.939785437 +0000 UTC m=+289.601382928" Apr 16 20:16:52.847764 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:52.847736 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:16:52.912076 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:52.911912 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-59ztq" Apr 16 20:16:53.870539 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:53.870509 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5mkhj" Apr 16 20:16:56.882983 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:16:56.882952 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-nn2zl" Apr 16 20:17:02.910053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:02.910025 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-s9g96" Apr 16 20:17:42.247032 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.246997 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"] Apr 16 20:17:42.249909 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.249888 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" Apr 16 20:17:42.254570 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.254544 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:17:42.256232 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.256026 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-bhcgq\"" Apr 16 20:17:42.256232 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.256086 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:17:42.256232 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.256033 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 20:17:42.258512 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.258488 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"] Apr 16 20:17:42.260448 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.260426 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.262819 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.262799 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 20:17:42.263352 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.263169 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-4pzd8\""
Apr 16 20:17:42.268269 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.266535 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"]
Apr 16 20:17:42.272961 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.272937 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"]
Apr 16 20:17:42.285410 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.285378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqv5s\" (UniqueName: \"kubernetes.io/projected/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-kube-api-access-xqv5s\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.285694 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.285674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.285823 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.285809 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9tw\" (UniqueName: \"kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.285948 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.285934 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.386715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.386676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.386715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.386719 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9tw\" (UniqueName: \"kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.386987 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.386748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.386987 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.386776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqv5s\" (UniqueName: \"kubernetes.io/projected/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-kube-api-access-xqv5s\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.386987 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:17:42.386825 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 16 20:17:42.386987 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:17:42.386903 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert podName:f18acdb1-e8a1-48b6-a585-ac48f51dca0d nodeName:}" failed. No retries permitted until 2026-04-16 20:17:42.886882389 +0000 UTC m=+350.548479869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert") pod "llmisvc-controller-manager-68cc5db7c4-2zvw8" (UID: "f18acdb1-e8a1-48b6-a585-ac48f51dca0d") : secret "llmisvc-webhook-server-cert" not found
Apr 16 20:17:42.389198 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.389179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.396273 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.396242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqv5s\" (UniqueName: \"kubernetes.io/projected/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-kube-api-access-xqv5s\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.396371 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.396349 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9tw\" (UniqueName: \"kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw\") pod \"kserve-controller-manager-66cf78b85b-j8fd9\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") " pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.560771 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.560669 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:42.680178 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.680152 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"]
Apr 16 20:17:42.682915 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:17:42.682885 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3172101_75d7_4aa0_8cd7_e180b5c1a449.slice/crio-e6411f5eb34f64656df4318c20def5ab223270789012a234df67c568832b7cb0 WatchSource:0}: Error finding container e6411f5eb34f64656df4318c20def5ab223270789012a234df67c568832b7cb0: Status 404 returned error can't find the container with id e6411f5eb34f64656df4318c20def5ab223270789012a234df67c568832b7cb0
Apr 16 20:17:42.684078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.684056 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:17:42.891852 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.891771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:42.894093 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:42.894074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18acdb1-e8a1-48b6-a585-ac48f51dca0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zvw8\" (UID: \"f18acdb1-e8a1-48b6-a585-ac48f51dca0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:43.083425 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:43.083391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" event={"ID":"d3172101-75d7-4aa0-8cd7-e180b5c1a449","Type":"ContainerStarted","Data":"e6411f5eb34f64656df4318c20def5ab223270789012a234df67c568832b7cb0"}
Apr 16 20:17:43.177814 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:43.177716 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:43.299240 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:43.299186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"]
Apr 16 20:17:43.301286 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:17:43.301259 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf18acdb1_e8a1_48b6_a585_ac48f51dca0d.slice/crio-9d849eebe7ed358d55f66f1aa1d4c6a9a431488df839a5896d4c83a18b79ebd4 WatchSource:0}: Error finding container 9d849eebe7ed358d55f66f1aa1d4c6a9a431488df839a5896d4c83a18b79ebd4: Status 404 returned error can't find the container with id 9d849eebe7ed358d55f66f1aa1d4c6a9a431488df839a5896d4c83a18b79ebd4
Apr 16 20:17:44.088666 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:44.088621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8" event={"ID":"f18acdb1-e8a1-48b6-a585-ac48f51dca0d","Type":"ContainerStarted","Data":"9d849eebe7ed358d55f66f1aa1d4c6a9a431488df839a5896d4c83a18b79ebd4"}
Apr 16 20:17:47.100107 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.100067 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" event={"ID":"d3172101-75d7-4aa0-8cd7-e180b5c1a449","Type":"ContainerStarted","Data":"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"}
Apr 16 20:17:47.100573 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.100114 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:17:47.101448 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.101428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8" event={"ID":"f18acdb1-e8a1-48b6-a585-ac48f51dca0d","Type":"ContainerStarted","Data":"2c44c32734b54f2bf34c70337a4ae1cfc4c6cc4c896f40ef6f13f0f1e1019c9b"}
Apr 16 20:17:47.101574 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.101560 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:17:47.122073 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.122022 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" podStartSLOduration=2.114803651 podStartE2EDuration="5.122005792s" podCreationTimestamp="2026-04-16 20:17:42 +0000 UTC" firstStartedPulling="2026-04-16 20:17:42.684256767 +0000 UTC m=+350.345854234" lastFinishedPulling="2026-04-16 20:17:45.691458895 +0000 UTC m=+353.353056375" observedRunningTime="2026-04-16 20:17:47.121138419 +0000 UTC m=+354.782735909" watchObservedRunningTime="2026-04-16 20:17:47.122005792 +0000 UTC m=+354.783603280"
Apr 16 20:17:47.138612 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:17:47.138562 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8" podStartSLOduration=2.147028947 podStartE2EDuration="5.138548117s" podCreationTimestamp="2026-04-16 20:17:42 +0000 UTC" firstStartedPulling="2026-04-16 20:17:43.305264456 +0000 UTC m=+350.966861923" lastFinishedPulling="2026-04-16 20:17:46.296783627 +0000 UTC m=+353.958381093" observedRunningTime="2026-04-16 20:17:47.136739202 +0000 UTC m=+354.798336691" watchObservedRunningTime="2026-04-16 20:17:47.138548117 +0000 UTC m=+354.800145605"
Apr 16 20:18:18.106868 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:18.106837 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zvw8"
Apr 16 20:18:18.109758 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:18.109738 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:18:19.391111 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.391074 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"]
Apr 16 20:18:19.391529 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.391342 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" podUID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" containerName="manager" containerID="cri-o://0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed" gracePeriod=10
Apr 16 20:18:19.413476 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.413448 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-7bv8g"]
Apr 16 20:18:19.467345 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.467319 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-7bv8g"]
Apr 16 20:18:19.467442 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.467431 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.607462 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.607432 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-cert\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.607611 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.607496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qc6\" (UniqueName: \"kubernetes.io/projected/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-kube-api-access-n5qc6\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.656384 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.656356 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:18:19.708743 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.708703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qc6\" (UniqueName: \"kubernetes.io/projected/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-kube-api-access-n5qc6\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.708913 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.708793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-cert\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.711083 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.711059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-cert\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.716620 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.716576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qc6\" (UniqueName: \"kubernetes.io/projected/d068de2e-40fd-4c8d-9e8c-c68a8cb12247-kube-api-access-n5qc6\") pod \"kserve-controller-manager-66cf78b85b-7bv8g\" (UID: \"d068de2e-40fd-4c8d-9e8c-c68a8cb12247\") " pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.809940 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.809904 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert\") pod \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") "
Apr 16 20:18:19.810115 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.809965 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks9tw\" (UniqueName: \"kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw\") pod \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\" (UID: \"d3172101-75d7-4aa0-8cd7-e180b5c1a449\") "
Apr 16 20:18:19.812078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.812052 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert" (OuterVolumeSpecName: "cert") pod "d3172101-75d7-4aa0-8cd7-e180b5c1a449" (UID: "d3172101-75d7-4aa0-8cd7-e180b5c1a449"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:18:19.812173 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.812077 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:19.812226 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.812074 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw" (OuterVolumeSpecName: "kube-api-access-ks9tw") pod "d3172101-75d7-4aa0-8cd7-e180b5c1a449" (UID: "d3172101-75d7-4aa0-8cd7-e180b5c1a449"). InnerVolumeSpecName "kube-api-access-ks9tw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:18:19.910904 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.910818 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks9tw\" (UniqueName: \"kubernetes.io/projected/d3172101-75d7-4aa0-8cd7-e180b5c1a449-kube-api-access-ks9tw\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\""
Apr 16 20:18:19.910904 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.910855 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3172101-75d7-4aa0-8cd7-e180b5c1a449-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\""
Apr 16 20:18:19.928438 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:19.928414 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-7bv8g"]
Apr 16 20:18:19.930870 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:18:19.930839 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd068de2e_40fd_4c8d_9e8c_c68a8cb12247.slice/crio-769117b3acd57d323f3496c3cc63c705c60ff3aed240079f9226f3ee064455f4 WatchSource:0}: Error finding container 769117b3acd57d323f3496c3cc63c705c60ff3aed240079f9226f3ee064455f4: Status 404 returned error can't find the container with id 769117b3acd57d323f3496c3cc63c705c60ff3aed240079f9226f3ee064455f4
Apr 16 20:18:20.205359 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.205272 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" containerID="0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed" exitCode=0
Apr 16 20:18:20.205359 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.205332 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9"
Apr 16 20:18:20.205359 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.205345 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" event={"ID":"d3172101-75d7-4aa0-8cd7-e180b5c1a449","Type":"ContainerDied","Data":"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"}
Apr 16 20:18:20.205693 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.205375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-j8fd9" event={"ID":"d3172101-75d7-4aa0-8cd7-e180b5c1a449","Type":"ContainerDied","Data":"e6411f5eb34f64656df4318c20def5ab223270789012a234df67c568832b7cb0"}
Apr 16 20:18:20.205693 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.205390 2569 scope.go:117] "RemoveContainer" containerID="0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"
Apr 16 20:18:20.206449 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.206419 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g" event={"ID":"d068de2e-40fd-4c8d-9e8c-c68a8cb12247","Type":"ContainerStarted","Data":"769117b3acd57d323f3496c3cc63c705c60ff3aed240079f9226f3ee064455f4"}
Apr 16 20:18:20.213743 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.213723 2569 scope.go:117] "RemoveContainer" containerID="0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"
Apr 16 20:18:20.213988 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:18:20.213967 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed\": container with ID starting with 0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed not found: ID does not exist" containerID="0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"
Apr 16 20:18:20.214056 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.214000 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed"} err="failed to get container status \"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed\": rpc error: code = NotFound desc = could not find container \"0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed\": container with ID starting with 0058f523f1459bdadb2fd2dc3136682cb4f97a4397c6ba746e411212351c82ed not found: ID does not exist"
Apr 16 20:18:20.226075 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.226042 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"]
Apr 16 20:18:20.229050 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.229027 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-j8fd9"]
Apr 16 20:18:20.949712 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:20.949681 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" path="/var/lib/kubelet/pods/d3172101-75d7-4aa0-8cd7-e180b5c1a449/volumes"
Apr 16 20:18:21.211846 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:21.211768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g" event={"ID":"d068de2e-40fd-4c8d-9e8c-c68a8cb12247","Type":"ContainerStarted","Data":"e65fe4550616e860b1b9ad088eaf3550516425309d0bd633a8c97981e3c2cdda"}
Apr 16 20:18:21.211846 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:21.211813 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:18:21.227884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:21.227793 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g" podStartSLOduration=1.6385375039999999 podStartE2EDuration="2.227774243s" podCreationTimestamp="2026-04-16 20:18:19 +0000 UTC" firstStartedPulling="2026-04-16 20:18:19.932185764 +0000 UTC m=+387.593783230" lastFinishedPulling="2026-04-16 20:18:20.521422499 +0000 UTC m=+388.183019969" observedRunningTime="2026-04-16 20:18:21.226892715 +0000 UTC m=+388.888490203" watchObservedRunningTime="2026-04-16 20:18:21.227774243 +0000 UTC m=+388.889371736"
Apr 16 20:18:52.220381 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:18:52.220348 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-7bv8g"
Apr 16 20:19:22.210686 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.210605 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-778bf4cfc7-d9xn5"]
Apr 16 20:19:22.211102 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.210978 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" containerName="manager"
Apr 16 20:19:22.211102 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.210991 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" containerName="manager"
Apr 16 20:19:22.211102 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.211051 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3172101-75d7-4aa0-8cd7-e180b5c1a449" containerName="manager"
Apr 16 20:19:22.213873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.213855 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.223286 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.223261 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778bf4cfc7-d9xn5"]
Apr 16 20:19:22.341120 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-oauth-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341137 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-service-ca\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341182 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-trusted-ca-bundle\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-oauth-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.341553 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.341360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89q5\" (UniqueName: \"kubernetes.io/projected/a861d4fc-8efc-499b-81fb-b1101b20ea23-kube-api-access-v89q5\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442762 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-trusted-ca-bundle\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442775 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442804 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-oauth-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v89q5\" (UniqueName: \"kubernetes.io/projected/a861d4fc-8efc-499b-81fb-b1101b20ea23-kube-api-access-v89q5\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-oauth-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.442955 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.442948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-service-ca\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.443664 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.443640 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-service-ca\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.443751 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.443708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-oauth-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.443751 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.443736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-trusted-ca-bundle\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.444032 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.444017 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.445247 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.445229 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-oauth-config\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.445437 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.445421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a861d4fc-8efc-499b-81fb-b1101b20ea23-console-serving-cert\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.452328 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.452298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89q5\" (UniqueName: \"kubernetes.io/projected/a861d4fc-8efc-499b-81fb-b1101b20ea23-kube-api-access-v89q5\") pod \"console-778bf4cfc7-d9xn5\" (UID: \"a861d4fc-8efc-499b-81fb-b1101b20ea23\") " pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.525168 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.525130 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778bf4cfc7-d9xn5"
Apr 16 20:19:22.644940 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:22.644813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778bf4cfc7-d9xn5"]
Apr 16 20:19:22.647733 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:19:22.647709 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda861d4fc_8efc_499b_81fb_b1101b20ea23.slice/crio-184aa82f5cdb2fc64dda38f10be0864f0e5c42f9113f4f48952a3c87bcfa4231 WatchSource:0}: Error finding container 184aa82f5cdb2fc64dda38f10be0864f0e5c42f9113f4f48952a3c87bcfa4231: Status 404 returned error can't find the container with id 184aa82f5cdb2fc64dda38f10be0864f0e5c42f9113f4f48952a3c87bcfa4231
Apr 16 20:19:23.406512 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:23.406473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778bf4cfc7-d9xn5" event={"ID":"a861d4fc-8efc-499b-81fb-b1101b20ea23","Type":"ContainerStarted","Data":"98333100bd74a91773ee70f5cc10331267f5b449bf2f44ea849ffb9adab2dcc2"}
Apr 16 20:19:23.406512 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:23.406515 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778bf4cfc7-d9xn5" event={"ID":"a861d4fc-8efc-499b-81fb-b1101b20ea23","Type":"ContainerStarted","Data":"184aa82f5cdb2fc64dda38f10be0864f0e5c42f9113f4f48952a3c87bcfa4231"}
Apr 16 20:19:23.425070 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:23.425025 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-778bf4cfc7-d9xn5" podStartSLOduration=1.4250131 podStartE2EDuration="1.4250131s" podCreationTimestamp="2026-04-16 20:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:19:23.423542899 +0000 UTC m=+451.085140389" watchObservedRunningTime="2026-04-16 20:19:23.4250131 +0000 UTC m=+451.086610588"
Apr 16 20:19:28.986141 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:28.986107 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"]
Apr 16 20:19:28.989684 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:28.989663 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"
Apr 16 20:19:28.991963 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:28.991939 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5hj24\""
Apr 16 20:19:29.000430 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.000399 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"]
Apr 16 20:19:29.007977 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.007948 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"
Apr 16 20:19:29.079419 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.079387 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"]
Apr 16 20:19:29.084175 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.084148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"
Apr 16 20:19:29.090071 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.090030 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"]
Apr 16 20:19:29.097177 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.097071 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"
Apr 16 20:19:29.170512 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.170141 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"]
Apr 16 20:19:29.176030 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:19:29.175784 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd204361_ed3b_49d5_b44b_647296fb6f74.slice/crio-e9350ff1aea600d3e6607002988f7e2ebae2f8c031ae60e99a1cf114089a0d67 WatchSource:0}: Error finding container e9350ff1aea600d3e6607002988f7e2ebae2f8c031ae60e99a1cf114089a0d67: Status 404 returned error can't find the container with id e9350ff1aea600d3e6607002988f7e2ebae2f8c031ae60e99a1cf114089a0d67
Apr 16 20:19:29.246954 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.246929 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"]
Apr 16 20:19:29.248574 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:19:29.248545 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1435886_d743_48a5_8fb6_db52d1a758a0.slice/crio-dc12fb0c64f31588290fbcd88f914e2ae51a4df2e5fafdf38f381e5e28d956cd WatchSource:0}: Error finding container dc12fb0c64f31588290fbcd88f914e2ae51a4df2e5fafdf38f381e5e28d956cd: Status 404 returned error can't find the container with id dc12fb0c64f31588290fbcd88f914e2ae51a4df2e5fafdf38f381e5e28d956cd
Apr 16 20:19:29.425913 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.425874 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" event={"ID":"c1435886-d743-48a5-8fb6-db52d1a758a0","Type":"ContainerStarted","Data":"dc12fb0c64f31588290fbcd88f914e2ae51a4df2e5fafdf38f381e5e28d956cd"}
Apr 16 20:19:29.426988 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:29.426962 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" event={"ID":"bd204361-ed3b-49d5-b44b-647296fb6f74","Type":"ContainerStarted","Data":"e9350ff1aea600d3e6607002988f7e2ebae2f8c031ae60e99a1cf114089a0d67"} Apr 16 20:19:32.527159 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:32.526124 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-778bf4cfc7-d9xn5" Apr 16 20:19:32.527159 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:32.527002 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-778bf4cfc7-d9xn5" Apr 16 20:19:32.534130 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:32.533924 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-778bf4cfc7-d9xn5" Apr 16 20:19:33.461253 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:33.461199 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-778bf4cfc7-d9xn5" Apr 16 20:19:33.516475 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:33.516442 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:19:43.494930 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.494897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" event={"ID":"c1435886-d743-48a5-8fb6-db52d1a758a0","Type":"ContainerStarted","Data":"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15"} Apr 16 20:19:43.495453 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.495068 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" Apr 16 20:19:43.496370 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.496342 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" event={"ID":"bd204361-ed3b-49d5-b44b-647296fb6f74","Type":"ContainerStarted","Data":"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81"} Apr 16 20:19:43.496577 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.496560 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" Apr 16 20:19:43.496815 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.496768 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:19:43.497677 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.497654 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:19:43.510977 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.510936 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podStartSLOduration=0.493818666 podStartE2EDuration="14.510921239s" podCreationTimestamp="2026-04-16 20:19:29 +0000 UTC" firstStartedPulling="2026-04-16 20:19:29.250318977 +0000 UTC m=+456.911916444" lastFinishedPulling="2026-04-16 20:19:43.267421549 +0000 UTC m=+470.929019017" observedRunningTime="2026-04-16 20:19:43.510191519 +0000 UTC m=+471.171789010" watchObservedRunningTime="2026-04-16 20:19:43.510921239 +0000 UTC 
m=+471.172518732" Apr 16 20:19:43.525242 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:43.525147 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podStartSLOduration=1.431212859 podStartE2EDuration="15.525134356s" podCreationTimestamp="2026-04-16 20:19:28 +0000 UTC" firstStartedPulling="2026-04-16 20:19:29.177373612 +0000 UTC m=+456.838971086" lastFinishedPulling="2026-04-16 20:19:43.271295096 +0000 UTC m=+470.932892583" observedRunningTime="2026-04-16 20:19:43.524049521 +0000 UTC m=+471.185647013" watchObservedRunningTime="2026-04-16 20:19:43.525134356 +0000 UTC m=+471.186731845" Apr 16 20:19:44.499668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:44.499631 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:19:44.500048 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:44.499635 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:19:54.500463 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:54.500418 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:19:54.500950 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:54.500419 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:19:58.543208 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.543142 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dbb4d86bc-vm9w8" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerName="console" containerID="cri-o://92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0" gracePeriod=15 Apr 16 20:19:58.696087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.696049 2569 patch_prober.go:28] interesting pod/console-6dbb4d86bc-vm9w8 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" start-of-body= Apr 16 20:19:58.696288 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.696125 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dbb4d86bc-vm9w8" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerName="console" probeResult="failure" output="Get \"https://10.134.0.16:8443/health\": dial tcp 10.134.0.16:8443: connect: connection refused" Apr 16 20:19:58.786902 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.786872 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbb4d86bc-vm9w8_f6ad24f9-fa9c-40be-921a-979c72985ebc/console/0.log" Apr 16 20:19:58.787036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.786940 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:19:58.866127 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866044 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866127 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866115 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866344 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866303 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866404 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866370 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866458 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866405 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sdpj\" (UniqueName: \"kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866458 
ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866436 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866561 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866493 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle\") pod \"f6ad24f9-fa9c-40be-921a-979c72985ebc\" (UID: \"f6ad24f9-fa9c-40be-921a-979c72985ebc\") " Apr 16 20:19:58.866779 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866703 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca" (OuterVolumeSpecName: "service-ca") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:19:58.866957 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866899 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-service-ca\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.867040 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.867017 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:19:58.867102 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.866737 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config" (OuterVolumeSpecName: "console-config") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:19:58.867204 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.867179 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:19:58.868921 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.868892 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj" (OuterVolumeSpecName: "kube-api-access-4sdpj") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "kube-api-access-4sdpj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:58.869031 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.868937 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:19:58.869031 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.868983 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f6ad24f9-fa9c-40be-921a-979c72985ebc" (UID: "f6ad24f9-fa9c-40be-921a-979c72985ebc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:19:58.967472 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967442 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-oauth-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.967472 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967468 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.967472 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967477 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-console-config\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.967738 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967488 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4sdpj\" (UniqueName: \"kubernetes.io/projected/f6ad24f9-fa9c-40be-921a-979c72985ebc-kube-api-access-4sdpj\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.967738 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967499 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-oauth-serving-cert\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:58.967738 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:58.967508 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6ad24f9-fa9c-40be-921a-979c72985ebc-trusted-ca-bundle\") on node \"ip-10-0-135-182.ec2.internal\" DevicePath \"\"" Apr 16 20:19:59.549991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.549963 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbb4d86bc-vm9w8_f6ad24f9-fa9c-40be-921a-979c72985ebc/console/0.log" Apr 16 20:19:59.550412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.550007 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerID="92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0" exitCode=2 Apr 16 20:19:59.550412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.550044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbb4d86bc-vm9w8" event={"ID":"f6ad24f9-fa9c-40be-921a-979c72985ebc","Type":"ContainerDied","Data":"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0"} Apr 16 20:19:59.550412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.550080 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dbb4d86bc-vm9w8" Apr 16 20:19:59.550412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.550090 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbb4d86bc-vm9w8" event={"ID":"f6ad24f9-fa9c-40be-921a-979c72985ebc","Type":"ContainerDied","Data":"bb7a295935ab89d46892b0ebfc49b4acb8e7d50c37d95a5bb67ec281da9dfece"} Apr 16 20:19:59.550412 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.550113 2569 scope.go:117] "RemoveContainer" containerID="92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0" Apr 16 20:19:59.557731 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.557705 2569 scope.go:117] "RemoveContainer" containerID="92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0" Apr 16 20:19:59.557991 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:19:59.557973 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0\": container with ID starting with 92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0 not found: ID does not exist" containerID="92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0" Apr 16 20:19:59.558054 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.558001 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0"} err="failed to get container status \"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0\": rpc error: code = NotFound desc = could not find container \"92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0\": container with ID starting with 92b846bad142bc043291be42fb4b9d2f98c58be675a671dd553f250092cd4cc0 not found: ID does not exist" Apr 16 20:19:59.570142 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.570099 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:19:59.572490 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:19:59.572467 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dbb4d86bc-vm9w8"] Apr 16 20:20:00.949673 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:00.949638 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" path="/var/lib/kubelet/pods/f6ad24f9-fa9c-40be-921a-979c72985ebc/volumes" Apr 16 20:20:04.500046 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:04.500006 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:20:04.500457 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:04.500008 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:20:14.499919 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:14.499874 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:20:14.500400 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:14.499874 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:20:24.499903 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:24.499856 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:20:24.500391 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:24.499860 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:20:34.501169 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:34.501073 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" Apr 16 20:20:34.501765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:20:34.501469 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" Apr 16 20:21:03.232601 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.232568 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"] Apr 16 20:21:03.233067 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.232883 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" containerID="cri-o://3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81" gracePeriod=30 Apr 16 20:21:03.269876 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:21:03.269844 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:21:03.270260 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.270244 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerName="console" Apr 16 20:21:03.270350 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.270263 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerName="console" Apr 16 20:21:03.270403 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.270381 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6ad24f9-fa9c-40be-921a-979c72985ebc" containerName="console" Apr 16 20:21:03.272779 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.272756 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:21:03.284632 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.284602 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:21:03.286202 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.286180 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:21:03.319600 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.319569 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"] Apr 16 20:21:03.319824 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.319793 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" containerID="cri-o://9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15" gracePeriod=30 Apr 16 20:21:03.345170 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.345139 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:21:03.348886 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.348861 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:21:03.362175 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.362152 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:21:03.372976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.372944 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:21:03.444200 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.444163 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:21:03.447794 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:21:03.447759 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20e661c_b2ed_4af5_82e1_e2e92d29e099.slice/crio-2ee0b2f20087e8d350560eb813654fc731fabfcbe70ab1153e8bd7e61cfd0c4f WatchSource:0}: Error finding container 2ee0b2f20087e8d350560eb813654fc731fabfcbe70ab1153e8bd7e61cfd0c4f: Status 404 returned error can't find the container with id 2ee0b2f20087e8d350560eb813654fc731fabfcbe70ab1153e8bd7e61cfd0c4f Apr 16 20:21:03.513293 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.513238 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:21:03.515508 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:21:03.515472 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdff0dbf_03bb_42bc_bb18_fd63c8f96524.slice/crio-c545b7546eb55c57ba18bd9d9acffa2453db3885fb98fbc70f26aaf07df076ad WatchSource:0}: Error finding container c545b7546eb55c57ba18bd9d9acffa2453db3885fb98fbc70f26aaf07df076ad: Status 404 returned error can't find the container with id c545b7546eb55c57ba18bd9d9acffa2453db3885fb98fbc70f26aaf07df076ad Apr 16 20:21:03.762958 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.762862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" event={"ID":"bdff0dbf-03bb-42bc-bb18-fd63c8f96524","Type":"ContainerStarted","Data":"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa"} Apr 16 20:21:03.763206 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.763164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" event={"ID":"bdff0dbf-03bb-42bc-bb18-fd63c8f96524","Type":"ContainerStarted","Data":"c545b7546eb55c57ba18bd9d9acffa2453db3885fb98fbc70f26aaf07df076ad"} Apr 16 20:21:03.763206 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.763196 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:21:03.764399 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.764365 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:03.764656 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.764631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" event={"ID":"a20e661c-b2ed-4af5-82e1-e2e92d29e099","Type":"ContainerStarted","Data":"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de"} Apr 16 20:21:03.764752 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.764667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" event={"ID":"a20e661c-b2ed-4af5-82e1-e2e92d29e099","Type":"ContainerStarted","Data":"2ee0b2f20087e8d350560eb813654fc731fabfcbe70ab1153e8bd7e61cfd0c4f"} Apr 16 20:21:03.764896 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:21:03.764871 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:21:03.766343 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.766312 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:03.790804 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.790758 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podStartSLOduration=0.790743902 podStartE2EDuration="790.743902ms" podCreationTimestamp="2026-04-16 20:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:03.790001702 +0000 UTC m=+551.451599210" watchObservedRunningTime="2026-04-16 20:21:03.790743902 +0000 UTC m=+551.452341390" Apr 16 20:21:03.791197 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:03.791174 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podStartSLOduration=0.791167754 podStartE2EDuration="791.167754ms" podCreationTimestamp="2026-04-16 20:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:03.776809434 +0000 UTC m=+551.438406928" watchObservedRunningTime="2026-04-16 20:21:03.791167754 +0000 UTC m=+551.452765240" Apr 16 20:21:04.500669 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:04.500621 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" 
podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 20:21:04.501053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:04.500621 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Apr 16 20:21:04.769324 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:04.769228 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:04.769519 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:04.769228 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:06.640834 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.640810 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" Apr 16 20:21:06.756565 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.756543 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" Apr 16 20:21:06.777087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.777055 2569 generic.go:358] "Generic (PLEG): container finished" podID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerID="9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15" exitCode=0 Apr 16 20:21:06.777279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.777126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" event={"ID":"c1435886-d743-48a5-8fb6-db52d1a758a0","Type":"ContainerDied","Data":"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15"} Apr 16 20:21:06.777279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.777160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" event={"ID":"c1435886-d743-48a5-8fb6-db52d1a758a0","Type":"ContainerDied","Data":"dc12fb0c64f31588290fbcd88f914e2ae51a4df2e5fafdf38f381e5e28d956cd"} Apr 16 20:21:06.777279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.777178 2569 scope.go:117] "RemoveContainer" containerID="9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15" Apr 16 20:21:06.777279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.777131 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf" Apr 16 20:21:06.778298 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.778277 2569 generic.go:358] "Generic (PLEG): container finished" podID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerID="3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81" exitCode=0 Apr 16 20:21:06.778401 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.778343 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" Apr 16 20:21:06.778401 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.778356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" event={"ID":"bd204361-ed3b-49d5-b44b-647296fb6f74","Type":"ContainerDied","Data":"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81"} Apr 16 20:21:06.778401 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.778377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4" event={"ID":"bd204361-ed3b-49d5-b44b-647296fb6f74","Type":"ContainerDied","Data":"e9350ff1aea600d3e6607002988f7e2ebae2f8c031ae60e99a1cf114089a0d67"} Apr 16 20:21:06.785401 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.785383 2569 scope.go:117] "RemoveContainer" containerID="9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15" Apr 16 20:21:06.785680 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:21:06.785658 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15\": container with ID starting with 9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15 not found: ID does not exist" containerID="9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15" Apr 16 20:21:06.785783 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.785685 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15"} err="failed to get container status \"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15\": rpc error: code = NotFound desc = could not find container \"9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15\": container 
with ID starting with 9caa27287540db85c24593bdf7f1517cbde9205e22bc9a5fd3737eac00d8eb15 not found: ID does not exist" Apr 16 20:21:06.785783 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.785702 2569 scope.go:117] "RemoveContainer" containerID="3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81" Apr 16 20:21:06.793485 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.793463 2569 scope.go:117] "RemoveContainer" containerID="3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81" Apr 16 20:21:06.793753 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:21:06.793730 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81\": container with ID starting with 3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81 not found: ID does not exist" containerID="3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81" Apr 16 20:21:06.793811 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.793765 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81"} err="failed to get container status \"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81\": rpc error: code = NotFound desc = could not find container \"3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81\": container with ID starting with 3e26860bf6b166bb8e20baa9e7d468d7abb610cf9a5d4536b1125095c446cf81 not found: ID does not exist" Apr 16 20:21:06.802197 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.802160 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"] Apr 16 20:21:06.803678 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.803656 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-19633-predictor-848c5d7698-5m9m4"] Apr 16 20:21:06.812357 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.812324 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"] Apr 16 20:21:06.815664 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.815641 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19633-predictor-7669bd77f9-qhlcf"] Apr 16 20:21:06.949697 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.949618 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" path="/var/lib/kubelet/pods/bd204361-ed3b-49d5-b44b-647296fb6f74/volumes" Apr 16 20:21:06.949865 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:06.949853 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" path="/var/lib/kubelet/pods/c1435886-d743-48a5-8fb6-db52d1a758a0/volumes" Apr 16 20:21:14.770012 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:14.769966 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:14.770456 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:14.769966 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:24.769689 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:24.769626 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:24.770297 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:24.769626 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:34.769972 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:34.769919 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:34.770506 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:34.769919 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:44.769666 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:44.769617 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 20:21:44.770151 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:44.769616 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" 
podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 16 20:21:49.205328 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205297 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"] Apr 16 20:21:49.205691 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205643 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" Apr 16 20:21:49.205691 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205653 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" Apr 16 20:21:49.205691 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205662 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" Apr 16 20:21:49.205691 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205668 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" Apr 16 20:21:49.205818 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205720 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1435886-d743-48a5-8fb6-db52d1a758a0" containerName="kserve-container" Apr 16 20:21:49.205818 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.205730 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd204361-ed3b-49d5-b44b-647296fb6f74" containerName="kserve-container" Apr 16 20:21:49.208813 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.208796 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:21:49.218936 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.218914 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:21:49.225307 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.225272 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"] Apr 16 20:21:49.343437 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.343403 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"] Apr 16 20:21:49.347872 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.347852 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:21:49.358150 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.358055 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:21:49.360536 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.360510 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"] Apr 16 20:21:49.366701 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.366670 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"] Apr 16 20:21:49.490684 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.490661 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"] Apr 16 20:21:49.492728 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:21:49.492696 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7bb9573_c41d_47dd_9f4a_99d642651fff.slice/crio-bdf7a93f51587f50d02a784bcb1b8cf20976669463441f45b6e86b2a6af23fad WatchSource:0}: Error finding container bdf7a93f51587f50d02a784bcb1b8cf20976669463441f45b6e86b2a6af23fad: Status 404 returned error can't find the container with id bdf7a93f51587f50d02a784bcb1b8cf20976669463441f45b6e86b2a6af23fad Apr 16 20:21:49.924226 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.924186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" event={"ID":"499bbfc5-41eb-4e33-a4bb-07d27515e806","Type":"ContainerStarted","Data":"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435"} Apr 16 20:21:49.924445 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.924248 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" 
event={"ID":"499bbfc5-41eb-4e33-a4bb-07d27515e806","Type":"ContainerStarted","Data":"0849bcb06575acbd632a84e3e594297c5ae85a366a3dd0c13774ac92eff37332"} Apr 16 20:21:49.924445 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.924268 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:21:49.925579 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.925549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" event={"ID":"e7bb9573-c41d-47dd-9f4a-99d642651fff","Type":"ContainerStarted","Data":"0030965ebc3c9bf67f3035d516bb2ac41688769f157f728f715516b054ae6f64"} Apr 16 20:21:49.925714 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.925585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" event={"ID":"e7bb9573-c41d-47dd-9f4a-99d642651fff","Type":"ContainerStarted","Data":"bdf7a93f51587f50d02a784bcb1b8cf20976669463441f45b6e86b2a6af23fad"} Apr 16 20:21:49.925784 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.925767 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:21:49.926132 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.926108 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:21:49.926718 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.926692 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:21:49.940524 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.940482 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podStartSLOduration=0.940469339 podStartE2EDuration="940.469339ms" podCreationTimestamp="2026-04-16 20:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:49.938690981 +0000 UTC m=+597.600288470" watchObservedRunningTime="2026-04-16 20:21:49.940469339 +0000 UTC m=+597.602066827" Apr 16 20:21:49.952182 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:49.952140 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podStartSLOduration=0.952126324 podStartE2EDuration="952.126324ms" podCreationTimestamp="2026-04-16 20:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:21:49.951775481 +0000 UTC m=+597.613372971" watchObservedRunningTime="2026-04-16 20:21:49.952126324 +0000 UTC m=+597.613723813" Apr 16 20:21:50.929600 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:50.929561 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:21:50.929967 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:50.929569 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:21:54.770411 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:54.770380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:21:54.772956 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:21:54.770626 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:22:00.929938 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:00.929884 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:22:00.930442 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:00.929884 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:22:10.929728 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:10.929626 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:22:10.930141 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:10.929626 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:22:20.930166 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:20.930123 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:22:20.930557 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:20.930126 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:22:30.929934 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:30.929887 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 16 20:22:30.930355 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:30.929888 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 16 20:22:40.930858 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:40.930817 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:22:40.931420 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:22:40.931271 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:30:28.239819 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.239783 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:30:28.240528 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.240133 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" containerID="cri-o://e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de" gracePeriod=30 Apr 16 20:30:28.302040 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.302000 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"] Apr 16 20:30:28.305524 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.305506 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" Apr 16 20:30:28.312721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.312683 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:30:28.313206 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.313170 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" containerID="cri-o://7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa" gracePeriod=30 Apr 16 20:30:28.317971 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.317952 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" Apr 16 20:30:28.324066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.324034 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"] Apr 16 20:30:28.359156 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.359107 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"] Apr 16 20:30:28.363428 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.363398 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" Apr 16 20:30:28.375116 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.374985 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"] Apr 16 20:30:28.376879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.376853 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" Apr 16 20:30:28.466490 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.466395 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"] Apr 16 20:30:28.469998 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:30:28.469966 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0ace3a_5d37_40cc_ba6d_d0413d21d79d.slice/crio-3f6c82bde72997198c1eea3497105fdab28e581223332eaa1d8ee4ebdc997f93 WatchSource:0}: Error finding container 3f6c82bde72997198c1eea3497105fdab28e581223332eaa1d8ee4ebdc997f93: Status 404 returned error can't find the container with id 3f6c82bde72997198c1eea3497105fdab28e581223332eaa1d8ee4ebdc997f93 Apr 16 20:30:28.472255 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.472237 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:30:28.533778 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.533637 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"] Apr 16 20:30:28.537146 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:30:28.537121 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71feaec2_df07_4dc8_a1dd_5ba0e77589c4.slice/crio-0d74a1e1548db79b429a30db9fc8f9bb266fe38ad9234dc51556a1ebc3ec4d3d WatchSource:0}: Error finding container 0d74a1e1548db79b429a30db9fc8f9bb266fe38ad9234dc51556a1ebc3ec4d3d: Status 404 returned error can't find the container with id 0d74a1e1548db79b429a30db9fc8f9bb266fe38ad9234dc51556a1ebc3ec4d3d Apr 16 20:30:28.628091 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.628050 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" event={"ID":"71feaec2-df07-4dc8-a1dd-5ba0e77589c4","Type":"ContainerStarted","Data":"0d74a1e1548db79b429a30db9fc8f9bb266fe38ad9234dc51556a1ebc3ec4d3d"} Apr 16 20:30:28.629349 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.629322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" event={"ID":"ec0ace3a-5d37-40cc-ba6d-d0413d21d79d","Type":"ContainerStarted","Data":"ecf2f75a4c6967c506af4ab94070ce3ba7f22fca8328ada8e1cf5704fb2628ff"} Apr 16 20:30:28.629481 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.629353 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" event={"ID":"ec0ace3a-5d37-40cc-ba6d-d0413d21d79d","Type":"ContainerStarted","Data":"3f6c82bde72997198c1eea3497105fdab28e581223332eaa1d8ee4ebdc997f93"} Apr 16 20:30:28.629545 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.629526 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" Apr 16 20:30:28.630850 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.630826 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:30:28.645703 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:28.645658 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podStartSLOduration=0.645644736 podStartE2EDuration="645.644736ms" podCreationTimestamp="2026-04-16 20:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:28.64307828 +0000 UTC m=+1116.304675769" watchObservedRunningTime="2026-04-16 20:30:28.645644736 +0000 UTC m=+1116.307242216" Apr 16 20:30:29.633935 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:29.633897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" event={"ID":"71feaec2-df07-4dc8-a1dd-5ba0e77589c4","Type":"ContainerStarted","Data":"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"} Apr 16 20:30:29.634356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:29.634131 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" Apr 16 20:30:29.634356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:29.634258 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:30:29.635380 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:29.635353 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:30:29.651750 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:29.651440 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podStartSLOduration=1.65142302 podStartE2EDuration="1.65142302s" podCreationTimestamp="2026-04-16 20:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:30:29.651191437 +0000 UTC m=+1117.312788921" watchObservedRunningTime="2026-04-16 20:30:29.65142302 +0000 UTC m=+1117.313020511" Apr 16 20:30:30.637796 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:30.637759 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:30:31.834139 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:31.834117 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:30:31.964062 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:31.964036 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:30:32.645975 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.645935 2569 generic.go:358] "Generic (PLEG): container finished" podID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerID="7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa" exitCode=0 Apr 16 20:30:32.646155 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.645995 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" Apr 16 20:30:32.646155 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.646018 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" event={"ID":"bdff0dbf-03bb-42bc-bb18-fd63c8f96524","Type":"ContainerDied","Data":"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa"} Apr 16 20:30:32.646155 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.646050 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv" event={"ID":"bdff0dbf-03bb-42bc-bb18-fd63c8f96524","Type":"ContainerDied","Data":"c545b7546eb55c57ba18bd9d9acffa2453db3885fb98fbc70f26aaf07df076ad"} Apr 16 20:30:32.646155 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.646067 2569 scope.go:117] "RemoveContainer" containerID="7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa" Apr 16 20:30:32.647251 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.647204 2569 generic.go:358] "Generic (PLEG): container finished" podID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerID="e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de" exitCode=0 Apr 16 20:30:32.647251 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.647247 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" event={"ID":"a20e661c-b2ed-4af5-82e1-e2e92d29e099","Type":"ContainerDied","Data":"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de"} Apr 16 20:30:32.647432 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.647272 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" Apr 16 20:30:32.647432 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.647289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg" event={"ID":"a20e661c-b2ed-4af5-82e1-e2e92d29e099","Type":"ContainerDied","Data":"2ee0b2f20087e8d350560eb813654fc731fabfcbe70ab1153e8bd7e61cfd0c4f"} Apr 16 20:30:32.654297 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.654282 2569 scope.go:117] "RemoveContainer" containerID="7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa" Apr 16 20:30:32.654555 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:30:32.654536 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa\": container with ID starting with 7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa not found: ID does not exist" containerID="7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa" Apr 16 20:30:32.654624 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.654565 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa"} err="failed to get container status \"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa\": rpc error: code = NotFound desc = could not find container \"7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa\": container with ID starting with 7444a48b003b26d6ec4ce4fddbe4b1a4302aa5a1440b8d87f1d0687dd42ad6fa not found: ID does not exist" Apr 16 20:30:32.654624 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.654583 2569 scope.go:117] "RemoveContainer" containerID="e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de" Apr 16 20:30:32.662466 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:30:32.662333 2569 scope.go:117] "RemoveContainer" containerID="e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de" Apr 16 20:30:32.662618 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:30:32.662596 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de\": container with ID starting with e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de not found: ID does not exist" containerID="e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de" Apr 16 20:30:32.662661 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.662632 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de"} err="failed to get container status \"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de\": rpc error: code = NotFound desc = could not find container \"e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de\": container with ID starting with e1407fc9a93173d2fa0e5b5d205fe567c4a495277b9c1d92b4776b94ae6075de not found: ID does not exist" Apr 16 20:30:32.669794 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.669766 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:30:32.673659 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.673637 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-15788-predictor-684db9cccb-rzgzg"] Apr 16 20:30:32.682002 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.681980 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:30:32.687424 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.687401 2569 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-15788-predictor-68bbf9d949-kglrv"] Apr 16 20:30:32.950660 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.950580 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" path="/var/lib/kubelet/pods/a20e661c-b2ed-4af5-82e1-e2e92d29e099/volumes" Apr 16 20:30:32.950998 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:32.950825 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" path="/var/lib/kubelet/pods/bdff0dbf-03bb-42bc-bb18-fd63c8f96524/volumes" Apr 16 20:30:39.634804 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:39.634762 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:30:40.638371 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:40.638325 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:30:49.635192 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:49.635140 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:30:50.638793 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:50.638740 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" 
podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:30:59.634478 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:30:59.634431 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:31:00.638168 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:00.638115 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:31:09.634581 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:09.634483 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 20:31:10.637914 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:10.637865 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 20:31:14.064016 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.063981 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"] Apr 16 20:31:14.064425 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.064269 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container" containerID="cri-o://0030965ebc3c9bf67f3035d516bb2ac41688769f157f728f715516b054ae6f64" gracePeriod=30 Apr 16 20:31:14.119364 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.119333 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"] Apr 16 20:31:14.119625 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.119601 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container" containerID="cri-o://64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435" gracePeriod=30 Apr 16 20:31:14.136267 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136238 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"] Apr 16 20:31:14.136718 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136698 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" Apr 16 20:31:14.136718 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136719 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" Apr 16 20:31:14.136873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136742 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" Apr 16 20:31:14.136873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136751 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" Apr 16 20:31:14.136873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136828 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdff0dbf-03bb-42bc-bb18-fd63c8f96524" containerName="kserve-container" Apr 16 20:31:14.136873 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.136842 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a20e661c-b2ed-4af5-82e1-e2e92d29e099" containerName="kserve-container" Apr 16 20:31:14.139814 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.139794 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" Apr 16 20:31:14.149803 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.149779 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" Apr 16 20:31:14.149803 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.149792 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"] Apr 16 20:31:14.213246 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.212995 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"] Apr 16 20:31:14.217577 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.217552 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" Apr 16 20:31:14.223382 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.223145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"] Apr 16 20:31:14.235574 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.235551 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" Apr 16 20:31:14.295445 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.295142 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"] Apr 16 20:31:14.300918 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:31:14.300746 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76aa1c41_c651_43d6_97aa_7702d68bcbf9.slice/crio-11c90d008554b1222bbbe4c95b83ddeef2b05de3e2adb5806e261042015ca011 WatchSource:0}: Error finding container 11c90d008554b1222bbbe4c95b83ddeef2b05de3e2adb5806e261042015ca011: Status 404 returned error can't find the container with id 11c90d008554b1222bbbe4c95b83ddeef2b05de3e2adb5806e261042015ca011 Apr 16 20:31:14.371148 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.371114 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"] Apr 16 20:31:14.374150 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:31:14.374117 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd50b50_d77c_4d15_9788_4b3c481b1f7b.slice/crio-c81e8e9b1dcfb092f0d5b1c818c3faaffa5e2dbbc1d0e96565e09e745f080c72 WatchSource:0}: Error finding container c81e8e9b1dcfb092f0d5b1c818c3faaffa5e2dbbc1d0e96565e09e745f080c72: Status 404 returned error can't find the container with id c81e8e9b1dcfb092f0d5b1c818c3faaffa5e2dbbc1d0e96565e09e745f080c72 Apr 16 20:31:14.786394 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.786356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" event={"ID":"dbd50b50-d77c-4d15-9788-4b3c481b1f7b","Type":"ContainerStarted","Data":"fee1ce78132290645bee6ca2f0db62155bd68b003848332e0eed068c87de7a8f"} 
Apr 16 20:31:14.786394 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.786393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" event={"ID":"dbd50b50-d77c-4d15-9788-4b3c481b1f7b","Type":"ContainerStarted","Data":"c81e8e9b1dcfb092f0d5b1c818c3faaffa5e2dbbc1d0e96565e09e745f080c72"} Apr 16 20:31:14.786692 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.786578 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" Apr 16 20:31:14.787780 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.787754 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" event={"ID":"76aa1c41-c651-43d6-97aa-7702d68bcbf9","Type":"ContainerStarted","Data":"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"} Apr 16 20:31:14.787780 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.787780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" event={"ID":"76aa1c41-c651-43d6-97aa-7702d68bcbf9","Type":"ContainerStarted","Data":"11c90d008554b1222bbbe4c95b83ddeef2b05de3e2adb5806e261042015ca011"} Apr 16 20:31:14.787965 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.787946 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" Apr 16 20:31:14.788259 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.788233 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 20:31:14.788912 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:31:14.788889 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:31:14.800789 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.800739 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podStartSLOduration=0.800722754 podStartE2EDuration="800.722754ms" podCreationTimestamp="2026-04-16 20:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:14.800135463 +0000 UTC m=+1162.461732963" watchObservedRunningTime="2026-04-16 20:31:14.800722754 +0000 UTC m=+1162.462320245" Apr 16 20:31:14.814721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:14.814680 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podStartSLOduration=0.814666729 podStartE2EDuration="814.666729ms" podCreationTimestamp="2026-04-16 20:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:14.81279306 +0000 UTC m=+1162.474390548" watchObservedRunningTime="2026-04-16 20:31:14.814666729 +0000 UTC m=+1162.476264217" Apr 16 20:31:15.791295 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:15.791246 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 20:31:15.791295 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:31:15.791271 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 16 20:31:17.799049 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:17.799015 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerID="0030965ebc3c9bf67f3035d516bb2ac41688769f157f728f715516b054ae6f64" exitCode=0 Apr 16 20:31:17.799462 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:17.799098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" event={"ID":"e7bb9573-c41d-47dd-9f4a-99d642651fff","Type":"ContainerDied","Data":"0030965ebc3c9bf67f3035d516bb2ac41688769f157f728f715516b054ae6f64"} Apr 16 20:31:17.817530 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:17.817506 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:31:18.354260 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.354237 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:31:18.803475 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.803432 2569 generic.go:358] "Generic (PLEG): container finished" podID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerID="64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435" exitCode=0 Apr 16 20:31:18.803943 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.803520 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" Apr 16 20:31:18.803943 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.803528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" event={"ID":"499bbfc5-41eb-4e33-a4bb-07d27515e806","Type":"ContainerDied","Data":"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435"} Apr 16 20:31:18.803943 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.803570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn" event={"ID":"499bbfc5-41eb-4e33-a4bb-07d27515e806","Type":"ContainerDied","Data":"0849bcb06575acbd632a84e3e594297c5ae85a366a3dd0c13774ac92eff37332"} Apr 16 20:31:18.803943 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.803591 2569 scope.go:117] "RemoveContainer" containerID="64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435" Apr 16 20:31:18.804919 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.804870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" event={"ID":"e7bb9573-c41d-47dd-9f4a-99d642651fff","Type":"ContainerDied","Data":"bdf7a93f51587f50d02a784bcb1b8cf20976669463441f45b6e86b2a6af23fad"} Apr 16 20:31:18.805006 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.804919 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl" Apr 16 20:31:18.813867 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.813841 2569 scope.go:117] "RemoveContainer" containerID="64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435" Apr 16 20:31:18.814234 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:31:18.814191 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435\": container with ID starting with 64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435 not found: ID does not exist" containerID="64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435" Apr 16 20:31:18.814322 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.814245 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435"} err="failed to get container status \"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435\": rpc error: code = NotFound desc = could not find container \"64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435\": container with ID starting with 64ec4da64a57d529f5afe9d3f8efa908d975c6aa3b9f2940d1e012d96248d435 not found: ID does not exist" Apr 16 20:31:18.814322 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.814269 2569 scope.go:117] "RemoveContainer" containerID="0030965ebc3c9bf67f3035d516bb2ac41688769f157f728f715516b054ae6f64" Apr 16 20:31:18.826720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.826692 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"] Apr 16 20:31:18.830563 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.830536 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-f682f-predictor-7c7bbf9bd6-8bbfn"]
Apr 16 20:31:18.843740 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.843707 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"]
Apr 16 20:31:18.847083 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.847057 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f682f-predictor-b9b6c9555-h8dsl"]
Apr 16 20:31:18.949590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.949557 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" path="/var/lib/kubelet/pods/499bbfc5-41eb-4e33-a4bb-07d27515e806/volumes"
Apr 16 20:31:18.949819 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:18.949806 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" path="/var/lib/kubelet/pods/e7bb9573-c41d-47dd-9f4a-99d642651fff/volumes"
Apr 16 20:31:19.635413 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:19.635375 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"
Apr 16 20:31:20.638946 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:20.638914 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"
Apr 16 20:31:25.792120 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:25.792079 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 20:31:25.792523 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:25.792079 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 20:31:35.791702 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:35.791657 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 20:31:35.792196 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:35.791657 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 20:31:45.792033 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:45.791992 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 20:31:45.792427 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:45.791993 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 20:31:48.507252 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.507205 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"]
Apr 16 20:31:48.507663 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.507458 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" containerID="cri-o://3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d" gracePeriod=30
Apr 16 20:31:48.557015 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.556961 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"]
Apr 16 20:31:48.557379 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.557325 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" containerID="cri-o://ecf2f75a4c6967c506af4ab94070ce3ba7f22fca8328ada8e1cf5704fb2628ff" gracePeriod=30
Apr 16 20:31:48.565889 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.565855 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"]
Apr 16 20:31:48.566264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566247 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container"
Apr 16 20:31:48.566264 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566265 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container"
Apr 16 20:31:48.566672 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566294 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container"
Apr 16 20:31:48.566672 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566303 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container"
Apr 16 20:31:48.566672 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566383 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7bb9573-c41d-47dd-9f4a-99d642651fff" containerName="kserve-container"
Apr 16 20:31:48.566672 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.566397 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="499bbfc5-41eb-4e33-a4bb-07d27515e806" containerName="kserve-container"
Apr 16 20:31:48.569307 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.569285 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"
Apr 16 20:31:48.577456 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.577428 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"]
Apr 16 20:31:48.580012 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.579993 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"
Apr 16 20:31:48.647526 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.646734 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"]
Apr 16 20:31:48.651957 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.651926 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"
Apr 16 20:31:48.658962 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.658845 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"]
Apr 16 20:31:48.664206 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.664130 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"
Apr 16 20:31:48.726930 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.726880 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"]
Apr 16 20:31:48.803956 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.803811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"]
Apr 16 20:31:48.806308 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:31:48.806271 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790136dc_dcdc_4345_8950_63bf199ca8e1.slice/crio-15b05d876b243857e7854468fa8c3423805dce291aa3920b772397faa40e9fbc WatchSource:0}: Error finding container 15b05d876b243857e7854468fa8c3423805dce291aa3920b772397faa40e9fbc: Status 404 returned error can't find the container with id 15b05d876b243857e7854468fa8c3423805dce291aa3920b772397faa40e9fbc
Apr 16 20:31:48.906849 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.906816 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" event={"ID":"f2fc63f4-a281-444a-9062-0a7929ab0c1d","Type":"ContainerStarted","Data":"6ab1d81d1ccb5adad620aadf5ae23b1584b01be281fd3adbfcdc84c88d1a40b4"}
Apr 16 20:31:48.906849 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.906855 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"
Apr 16 20:31:48.907111 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.906866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" event={"ID":"f2fc63f4-a281-444a-9062-0a7929ab0c1d","Type":"ContainerStarted","Data":"29a714c4fbc70eb73968aa92ed9e7b8129021a9baeb7bb7e9664b75f6ac5f482"}
Apr 16 20:31:48.908188 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.908161 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" event={"ID":"790136dc-dcdc-4345-8950-63bf199ca8e1","Type":"ContainerStarted","Data":"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42"}
Apr 16 20:31:48.908188 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.908191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" event={"ID":"790136dc-dcdc-4345-8950-63bf199ca8e1","Type":"ContainerStarted","Data":"15b05d876b243857e7854468fa8c3423805dce291aa3920b772397faa40e9fbc"}
Apr 16 20:31:48.908416 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.908390 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"
Apr 16 20:31:48.908609 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.908588 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:31:48.909355 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.909334 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:31:48.921325 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.921261 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podStartSLOduration=0.92124299 podStartE2EDuration="921.24299ms" podCreationTimestamp="2026-04-16 20:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:48.920143753 +0000 UTC m=+1196.581741245" watchObservedRunningTime="2026-04-16 20:31:48.92124299 +0000 UTC m=+1196.582840477"
Apr 16 20:31:48.934905 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:48.934850 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podStartSLOduration=0.934836022 podStartE2EDuration="934.836022ms" podCreationTimestamp="2026-04-16 20:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:48.933057679 +0000 UTC m=+1196.594655169" watchObservedRunningTime="2026-04-16 20:31:48.934836022 +0000 UTC m=+1196.596433511"
Apr 16 20:31:49.634996 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:49.634946 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 16 20:31:49.911910 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:49.911805 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:31:49.912106 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:49.912080 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:31:50.637950 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:50.637905 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 16 20:31:52.888189 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.888161 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"
Apr 16 20:31:52.923384 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.923294 2569 generic.go:358] "Generic (PLEG): container finished" podID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerID="3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d" exitCode=0
Apr 16 20:31:52.923546 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.923340 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" event={"ID":"71feaec2-df07-4dc8-a1dd-5ba0e77589c4","Type":"ContainerDied","Data":"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"}
Apr 16 20:31:52.923546 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.923523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8" event={"ID":"71feaec2-df07-4dc8-a1dd-5ba0e77589c4","Type":"ContainerDied","Data":"0d74a1e1548db79b429a30db9fc8f9bb266fe38ad9234dc51556a1ebc3ec4d3d"}
Apr 16 20:31:52.923677 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.923549 2569 scope.go:117] "RemoveContainer" containerID="3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"
Apr 16 20:31:52.923677 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.923671 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"
Apr 16 20:31:52.925667 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.925643 2569 generic.go:358] "Generic (PLEG): container finished" podID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerID="ecf2f75a4c6967c506af4ab94070ce3ba7f22fca8328ada8e1cf5704fb2628ff" exitCode=0
Apr 16 20:31:52.925789 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.925731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" event={"ID":"ec0ace3a-5d37-40cc-ba6d-d0413d21d79d","Type":"ContainerDied","Data":"ecf2f75a4c6967c506af4ab94070ce3ba7f22fca8328ada8e1cf5704fb2628ff"}
Apr 16 20:31:52.934560 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.934541 2569 scope.go:117] "RemoveContainer" containerID="3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"
Apr 16 20:31:52.934824 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:31:52.934801 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d\": container with ID starting with 3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d not found: ID does not exist" containerID="3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"
Apr 16 20:31:52.934904 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.934839 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d"} err="failed to get container status \"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d\": rpc error: code = NotFound desc = could not find container \"3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d\": container with ID starting with 3cf56d67778de5e9489b9772b5c67904f4642c8e6cb0fc59934411b42e49f22d not found: ID does not exist"
Apr 16 20:31:52.954280 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.952286 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"]
Apr 16 20:31:52.954280 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:52.952323 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8eac5-predictor-6c64cc44d7-7fgf8"]
Apr 16 20:31:53.007160 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.007138 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"
Apr 16 20:31:53.929709 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.929617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8" event={"ID":"ec0ace3a-5d37-40cc-ba6d-d0413d21d79d","Type":"ContainerDied","Data":"3f6c82bde72997198c1eea3497105fdab28e581223332eaa1d8ee4ebdc997f93"}
Apr 16 20:31:53.929709 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.929635 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"
Apr 16 20:31:53.929709 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.929661 2569 scope.go:117] "RemoveContainer" containerID="ecf2f75a4c6967c506af4ab94070ce3ba7f22fca8328ada8e1cf5704fb2628ff"
Apr 16 20:31:53.951166 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.951136 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"]
Apr 16 20:31:53.955206 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:53.955173 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8eac5-predictor-6ff8c47f86-d9nq8"]
Apr 16 20:31:54.950036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:54.950003 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" path="/var/lib/kubelet/pods/71feaec2-df07-4dc8-a1dd-5ba0e77589c4/volumes"
Apr 16 20:31:54.950423 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:54.950274 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" path="/var/lib/kubelet/pods/ec0ace3a-5d37-40cc-ba6d-d0413d21d79d/volumes"
Apr 16 20:31:55.791449 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:55.791405 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 20:31:55.791649 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:55.791405 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 20:31:59.912754 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:59.912702 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:31:59.913138 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:31:59.912702 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:32:05.792945 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:05.792913 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"
Apr 16 20:32:05.793370 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:05.793029 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"
Apr 16 20:32:09.912720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:09.912678 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:32:09.913167 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:09.912680 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:32:19.912338 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:19.912287 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:32:19.912722 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:19.912295 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:32:29.912577 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:29.912529 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 20:32:29.912951 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:29.912537 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 16 20:32:34.377549 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377463 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"]
Apr 16 20:32:34.377927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377859 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container"
Apr 16 20:32:34.377927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377871 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container"
Apr 16 20:32:34.377927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377888 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container"
Apr 16 20:32:34.377927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377895 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container"
Apr 16 20:32:34.378050 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377963 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="71feaec2-df07-4dc8-a1dd-5ba0e77589c4" containerName="kserve-container"
Apr 16 20:32:34.378050 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.377971 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec0ace3a-5d37-40cc-ba6d-d0413d21d79d" containerName="kserve-container"
Apr 16 20:32:34.380853 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.380836 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"
Apr 16 20:32:34.390485 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.390454 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"
Apr 16 20:32:34.395736 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.393537 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"]
Apr 16 20:32:34.395736 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.393818 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" containerID="cri-o://6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d" gracePeriod=30
Apr 16 20:32:34.402992 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.402964 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"]
Apr 16 20:32:34.511127 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.511096 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"]
Apr 16 20:32:34.511375 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.511350 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" containerID="cri-o://fee1ce78132290645bee6ca2f0db62155bd68b003848332e0eed068c87de7a8f" gracePeriod=30
Apr 16 20:32:34.517616 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.517586 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"]
Apr 16 20:32:34.522435 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.522411 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"]
Apr 16 20:32:34.522567 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.522522 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"
Apr 16 20:32:34.524837 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:32:34.524811 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9564e1c_a911_4775_a65e_6d3137def60a.slice/crio-58ccd03d7558d790be4b85fba680c79d0c4a9b3e4cb12b52fb98314c7f557f5c WatchSource:0}: Error finding container 58ccd03d7558d790be4b85fba680c79d0c4a9b3e4cb12b52fb98314c7f557f5c: Status 404 returned error can't find the container with id 58ccd03d7558d790be4b85fba680c79d0c4a9b3e4cb12b52fb98314c7f557f5c
Apr 16 20:32:34.529653 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.529629 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"]
Apr 16 20:32:34.543465 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.543421 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"
Apr 16 20:32:34.673450 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:34.673336 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"]
Apr 16 20:32:34.676655 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:32:34.676621 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc70b46_a368_4fa7_ab84_44dfe556aeaa.slice/crio-c9c84bb94412f6f9f71e5e083f18c824bee621ed9ca1f716b7c5d57fbb61d21a WatchSource:0}: Error finding container c9c84bb94412f6f9f71e5e083f18c824bee621ed9ca1f716b7c5d57fbb61d21a: Status 404 returned error can't find the container with id c9c84bb94412f6f9f71e5e083f18c824bee621ed9ca1f716b7c5d57fbb61d21a
Apr 16 20:32:35.073754 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.073716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" event={"ID":"bdc70b46-a368-4fa7-ab84-44dfe556aeaa","Type":"ContainerStarted","Data":"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858"}
Apr 16 20:32:35.073965 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.073762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" event={"ID":"bdc70b46-a368-4fa7-ab84-44dfe556aeaa","Type":"ContainerStarted","Data":"c9c84bb94412f6f9f71e5e083f18c824bee621ed9ca1f716b7c5d57fbb61d21a"}
Apr 16 20:32:35.073965 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.073901 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"
Apr 16 20:32:35.075199 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.075171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" event={"ID":"a9564e1c-a911-4775-a65e-6d3137def60a","Type":"ContainerStarted","Data":"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e"}
Apr 16 20:32:35.075356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.075204 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" event={"ID":"a9564e1c-a911-4775-a65e-6d3137def60a","Type":"ContainerStarted","Data":"58ccd03d7558d790be4b85fba680c79d0c4a9b3e4cb12b52fb98314c7f557f5c"}
Apr 16 20:32:35.075356 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.075235 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"
Apr 16 20:32:35.075515 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.075492 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 16 20:32:35.076280 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.076259 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:32:35.089435 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.089389 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podStartSLOduration=1.089374925 podStartE2EDuration="1.089374925s" podCreationTimestamp="2026-04-16 20:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:32:35.087601163 +0000 UTC m=+1242.749198652" watchObservedRunningTime="2026-04-16 20:32:35.089374925 +0000 UTC m=+1242.750972413"
Apr 16 20:32:35.102237 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.102165 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podStartSLOduration=1.102147911 podStartE2EDuration="1.102147911s" podCreationTimestamp="2026-04-16 20:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:32:35.101341827 +0000 UTC m=+1242.762939316" watchObservedRunningTime="2026-04-16 20:32:35.102147911 +0000 UTC m=+1242.763745401"
Apr 16 20:32:35.792022 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.791978 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 16 20:32:35.792422 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:35.791978 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 20:32:36.078756 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:36.078662 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 16 20:32:36.078945 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:36.078753 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused"
Apr 16 20:32:38.542580 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:38.542553 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"
Apr 16 20:32:39.090483 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.090449 2569 generic.go:358] "Generic (PLEG): container finished" podID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerID="fee1ce78132290645bee6ca2f0db62155bd68b003848332e0eed068c87de7a8f" exitCode=0
Apr 16 20:32:39.090737 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.090569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" event={"ID":"dbd50b50-d77c-4d15-9788-4b3c481b1f7b","Type":"ContainerDied","Data":"fee1ce78132290645bee6ca2f0db62155bd68b003848332e0eed068c87de7a8f"}
Apr 16 20:32:39.093271 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.093207 2569 generic.go:358] "Generic (PLEG): container finished" podID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerID="6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d" exitCode=0
Apr 16 20:32:39.093527 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.093331 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"
Apr 16 20:32:39.093611 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.093347 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" event={"ID":"76aa1c41-c651-43d6-97aa-7702d68bcbf9","Type":"ContainerDied","Data":"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"}
Apr 16 20:32:39.093668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.093625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs" event={"ID":"76aa1c41-c651-43d6-97aa-7702d68bcbf9","Type":"ContainerDied","Data":"11c90d008554b1222bbbe4c95b83ddeef2b05de3e2adb5806e261042015ca011"}
Apr 16 20:32:39.093668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.093649 2569 scope.go:117] "RemoveContainer" containerID="6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"
Apr 16 20:32:39.103959 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.103934 2569 scope.go:117] "RemoveContainer" containerID="6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"
Apr 16 20:32:39.104347 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:32:39.104318 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d\": container with ID starting with 6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d not found: ID does not exist" containerID="6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"
Apr 16 20:32:39.104427 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.104356 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d"} err="failed to get container status \"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d\": rpc error: code = NotFound desc = could not find container \"6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d\": container with ID starting with 6f7224056ab6532df7a251bfaeeaed7eb8367bee6ccbf0ed83caebee03d8a77d not found: ID does not exist"
Apr 16 20:32:39.109277 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.109250 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"]
Apr 16 20:32:39.114105 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.114078 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-01ef2-predictor-6c594797db-sg4fs"]
Apr 16 20:32:39.145976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.145954 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"
Apr 16 20:32:39.913271 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.913240 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"
Apr 16 20:32:39.913657 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:39.913395 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"
Apr 16 20:32:40.100378 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.100335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" event={"ID":"dbd50b50-d77c-4d15-9788-4b3c481b1f7b","Type":"ContainerDied","Data":"c81e8e9b1dcfb092f0d5b1c818c3faaffa5e2dbbc1d0e96565e09e745f080c72"}
Apr 16 20:32:40.100378 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.100360 2569 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d" Apr 16 20:32:40.100617 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.100394 2569 scope.go:117] "RemoveContainer" containerID="fee1ce78132290645bee6ca2f0db62155bd68b003848332e0eed068c87de7a8f" Apr 16 20:32:40.123576 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.123544 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"] Apr 16 20:32:40.126769 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.126745 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-01ef2-predictor-bf7749ff5-7qs4d"] Apr 16 20:32:40.949644 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.949609 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" path="/var/lib/kubelet/pods/76aa1c41-c651-43d6-97aa-7702d68bcbf9/volumes" Apr 16 20:32:40.950007 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:40.949848 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" path="/var/lib/kubelet/pods/dbd50b50-d77c-4d15-9788-4b3c481b1f7b/volumes" Apr 16 20:32:46.078711 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:46.078668 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 20:32:46.079168 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:46.078669 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection 
refused" Apr 16 20:32:56.078994 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:56.078947 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 20:32:56.079408 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:32:56.078947 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 20:33:06.079416 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:06.079377 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 20:33:06.079810 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:06.079377 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 20:33:16.079327 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:16.079283 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 20:33:16.079713 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:16.079283 2569 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 20:33:26.079587 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:26.079550 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" Apr 16 20:33:26.080041 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:33:26.079991 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" Apr 16 20:41:13.465330 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.465293 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"] Apr 16 20:41:13.467824 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.465558 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" containerID="cri-o://6ab1d81d1ccb5adad620aadf5ae23b1584b01be281fd3adbfcdc84c88d1a40b4" gracePeriod=30 Apr 16 20:41:13.511319 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511279 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:41:13.511825 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511807 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" Apr 16 20:41:13.511879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511830 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" 
containerName="kserve-container" Apr 16 20:41:13.511879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511842 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" Apr 16 20:41:13.511879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511853 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" Apr 16 20:41:13.511971 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511957 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbd50b50-d77c-4d15-9788-4b3c481b1f7b" containerName="kserve-container" Apr 16 20:41:13.512004 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.511975 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="76aa1c41-c651-43d6-97aa-7702d68bcbf9" containerName="kserve-container" Apr 16 20:41:13.522680 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.522348 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:41:13.534260 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.534236 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:41:13.543091 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.543062 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:41:13.579115 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.579080 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"] Apr 16 20:41:13.579367 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.579344 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" containerID="cri-o://35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42" gracePeriod=30 Apr 16 20:41:13.590857 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.590830 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:41:13.594482 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.594458 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:41:13.608488 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.608438 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:41:13.610822 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.610397 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:41:13.690991 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.690966 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:41:13.693779 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:41:13.693749 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbadb1357_93f0_417d_87b5_53cea188b186.slice/crio-74bd2fe3ffd32f2ce47bfe2cd96abe92b1109f6b31fb780e2b3ccaa345e35a92 WatchSource:0}: Error finding container 74bd2fe3ffd32f2ce47bfe2cd96abe92b1109f6b31fb780e2b3ccaa345e35a92: Status 404 returned error can't find the container with id 74bd2fe3ffd32f2ce47bfe2cd96abe92b1109f6b31fb780e2b3ccaa345e35a92 Apr 16 20:41:13.696057 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.696036 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:41:13.757134 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.757111 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:41:13.759084 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:41:13.759047 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32ceae94_9582_4d3e_90eb_82fe49daba42.slice/crio-4a75794b5fdf8ce4176c7bd543ad1ab63b0863f077b70ffd8e863a4de0c3e23e WatchSource:0}: Error finding container 4a75794b5fdf8ce4176c7bd543ad1ab63b0863f077b70ffd8e863a4de0c3e23e: Status 404 returned error can't find the container with id 4a75794b5fdf8ce4176c7bd543ad1ab63b0863f077b70ffd8e863a4de0c3e23e Apr 16 20:41:13.810806 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.810772 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" event={"ID":"badb1357-93f0-417d-87b5-53cea188b186","Type":"ContainerStarted","Data":"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754"} Apr 16 20:41:13.810978 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.810904 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:41:13.810978 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.810926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" event={"ID":"badb1357-93f0-417d-87b5-53cea188b186","Type":"ContainerStarted","Data":"74bd2fe3ffd32f2ce47bfe2cd96abe92b1109f6b31fb780e2b3ccaa345e35a92"} Apr 16 20:41:13.811992 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.811968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" event={"ID":"32ceae94-9582-4d3e-90eb-82fe49daba42","Type":"ContainerStarted","Data":"4a75794b5fdf8ce4176c7bd543ad1ab63b0863f077b70ffd8e863a4de0c3e23e"} Apr 16 20:41:13.812168 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.812140 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:13.851952 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:13.851904 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podStartSLOduration=0.851888746 podStartE2EDuration="851.888746ms" podCreationTimestamp="2026-04-16 20:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:13.85000214 +0000 UTC m=+1761.511599629" watchObservedRunningTime="2026-04-16 20:41:13.851888746 +0000 UTC m=+1761.513486234" Apr 16 20:41:14.817172 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:14.817131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" event={"ID":"32ceae94-9582-4d3e-90eb-82fe49daba42","Type":"ContainerStarted","Data":"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48"} Apr 16 20:41:14.817654 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:14.817414 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:41:14.817654 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:14.817499 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:14.818742 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:14.818720 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:14.837814 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:14.837771 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podStartSLOduration=1.837759786 podStartE2EDuration="1.837759786s" podCreationTimestamp="2026-04-16 20:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:14.836120895 +0000 UTC m=+1762.497718385" watchObservedRunningTime="2026-04-16 20:41:14.837759786 +0000 UTC m=+1762.499357275" Apr 16 20:41:15.822715 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:15.822678 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:16.826718 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:16.826678 2569 generic.go:358] "Generic (PLEG): container finished" podID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerID="6ab1d81d1ccb5adad620aadf5ae23b1584b01be281fd3adbfcdc84c88d1a40b4" exitCode=0 Apr 16 20:41:16.827065 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:16.826745 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" event={"ID":"f2fc63f4-a281-444a-9062-0a7929ab0c1d","Type":"ContainerDied","Data":"6ab1d81d1ccb5adad620aadf5ae23b1584b01be281fd3adbfcdc84c88d1a40b4"} Apr 16 20:41:16.917385 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:16.917360 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" Apr 16 20:41:17.121629 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.121605 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" Apr 16 20:41:17.831568 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.831529 2569 generic.go:358] "Generic (PLEG): container finished" podID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerID="35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42" exitCode=0 Apr 16 20:41:17.831994 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.831595 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" Apr 16 20:41:17.831994 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.831614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" event={"ID":"790136dc-dcdc-4345-8950-63bf199ca8e1","Type":"ContainerDied","Data":"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42"} Apr 16 20:41:17.831994 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.831650 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6" event={"ID":"790136dc-dcdc-4345-8950-63bf199ca8e1","Type":"ContainerDied","Data":"15b05d876b243857e7854468fa8c3423805dce291aa3920b772397faa40e9fbc"} Apr 16 20:41:17.831994 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.831669 2569 scope.go:117] "RemoveContainer" containerID="35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42" Apr 16 20:41:17.832864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.832790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" event={"ID":"f2fc63f4-a281-444a-9062-0a7929ab0c1d","Type":"ContainerDied","Data":"29a714c4fbc70eb73968aa92ed9e7b8129021a9baeb7bb7e9664b75f6ac5f482"} Apr 16 20:41:17.832864 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.832825 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf" Apr 16 20:41:17.841918 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.841647 2569 scope.go:117] "RemoveContainer" containerID="35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42" Apr 16 20:41:17.842258 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:41:17.842232 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42\": container with ID starting with 35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42 not found: ID does not exist" containerID="35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42" Apr 16 20:41:17.842382 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.842268 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42"} err="failed to get container status \"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42\": rpc error: code = NotFound desc = could not find container \"35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42\": container with ID starting with 35c43f65e82a0f41568feb822d6f35de60d59802cb2d5d497aaf4406046c0f42 not found: ID does not exist" Apr 16 20:41:17.842382 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.842287 2569 scope.go:117] "RemoveContainer" containerID="6ab1d81d1ccb5adad620aadf5ae23b1584b01be281fd3adbfcdc84c88d1a40b4" Apr 16 20:41:17.849629 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.849606 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"] Apr 16 20:41:17.852036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.852013 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-b8212-predictor-598b577b-bw5tf"] Apr 16 20:41:17.859496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.859474 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"] Apr 16 20:41:17.861825 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:17.861806 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-b8212-predictor-58b6586fc-kb7m6"] Apr 16 20:41:18.950728 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:18.950690 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" path="/var/lib/kubelet/pods/790136dc-dcdc-4345-8950-63bf199ca8e1/volumes" Apr 16 20:41:18.951073 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:18.950968 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" path="/var/lib/kubelet/pods/f2fc63f4-a281-444a-9062-0a7929ab0c1d/volumes" Apr 16 20:41:24.818448 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:24.818403 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:25.823661 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:25.823610 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:34.818462 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:34.818369 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:35.822945 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:35.822904 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:44.818343 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:44.818298 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:45.822826 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:45.822782 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:54.818043 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:54.817999 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 16 20:41:55.823558 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:55.823515 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" 
podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:41:59.303935 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.303887 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"] Apr 16 20:41:59.304335 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.304104 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" containerID="cri-o://a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e" gracePeriod=30 Apr 16 20:41:59.351305 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351269 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:41:59.351659 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351646 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" Apr 16 20:41:59.351703 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351660 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" Apr 16 20:41:59.351703 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351670 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" Apr 16 20:41:59.351703 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351676 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" Apr 16 20:41:59.351798 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351740 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="790136dc-dcdc-4345-8950-63bf199ca8e1" containerName="kserve-container" Apr 16 20:41:59.351798 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.351752 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2fc63f4-a281-444a-9062-0a7929ab0c1d" containerName="kserve-container" Apr 16 20:41:59.355958 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.355937 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:41:59.364726 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.364565 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:41:59.369651 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.369624 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:41:59.397369 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.397337 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"] Apr 16 20:41:59.397791 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.397599 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" containerID="cri-o://ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858" gracePeriod=30 Apr 16 20:41:59.447158 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.447116 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:41:59.452278 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.452251 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:41:59.465056 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.464587 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:41:59.474404 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.469404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:41:59.524796 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.524760 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:41:59.526739 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:41:59.526690 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25c668a5_81e3_47be_87d1_261dfeb91853.slice/crio-d687d44e3aeb84105ebde5e161b6910c1c22df199b98f747f34b4a3035addf83 WatchSource:0}: Error finding container d687d44e3aeb84105ebde5e161b6910c1c22df199b98f747f34b4a3035addf83: Status 404 returned error can't find the container with id d687d44e3aeb84105ebde5e161b6910c1c22df199b98f747f34b4a3035addf83 Apr 16 20:41:59.617625 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.617599 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:41:59.619685 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:41:59.619653 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd77c35_c20e_4c23_9193_68c30d8d8f14.slice/crio-86c2f713476529daa73832249988704854444e59f8e9208821bb9c25eff1b071 WatchSource:0}: Error finding container 86c2f713476529daa73832249988704854444e59f8e9208821bb9c25eff1b071: Status 404 returned error can't find 
the container with id 86c2f713476529daa73832249988704854444e59f8e9208821bb9c25eff1b071 Apr 16 20:41:59.972889 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.972799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" event={"ID":"2dd77c35-c20e-4c23-9193-68c30d8d8f14","Type":"ContainerStarted","Data":"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad"} Apr 16 20:41:59.972889 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.972845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" event={"ID":"2dd77c35-c20e-4c23-9193-68c30d8d8f14","Type":"ContainerStarted","Data":"86c2f713476529daa73832249988704854444e59f8e9208821bb9c25eff1b071"} Apr 16 20:41:59.973114 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.973021 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:41:59.974314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.974286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" event={"ID":"25c668a5-81e3-47be-87d1-261dfeb91853","Type":"ContainerStarted","Data":"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a"} Apr 16 20:41:59.974314 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.974318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" event={"ID":"25c668a5-81e3-47be-87d1-261dfeb91853","Type":"ContainerStarted","Data":"d687d44e3aeb84105ebde5e161b6910c1c22df199b98f747f34b4a3035addf83"} Apr 16 20:41:59.974495 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.974443 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:41:59.974721 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.974699 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:41:59.975348 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.975328 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:41:59.987046 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:41:59.987000 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podStartSLOduration=0.986989254 podStartE2EDuration="986.989254ms" podCreationTimestamp="2026-04-16 20:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:59.986454172 +0000 UTC m=+1807.648051675" watchObservedRunningTime="2026-04-16 20:41:59.986989254 +0000 UTC m=+1807.648586743" Apr 16 20:42:00.000788 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:00.000750 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podStartSLOduration=1.000724522 podStartE2EDuration="1.000724522s" podCreationTimestamp="2026-04-16 20:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:59.999636066 +0000 UTC m=+1807.661233554" 
watchObservedRunningTime="2026-04-16 20:42:00.000724522 +0000 UTC m=+1807.662322012" Apr 16 20:42:00.977590 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:00.977544 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:42:00.978046 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:00.977684 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:42:03.555530 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.555502 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" Apr 16 20:42:03.852737 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.852716 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" Apr 16 20:42:03.987638 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.987602 2569 generic.go:358] "Generic (PLEG): container finished" podID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerID="ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858" exitCode=0 Apr 16 20:42:03.987799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.987670 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" Apr 16 20:42:03.987799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.987681 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" event={"ID":"bdc70b46-a368-4fa7-ab84-44dfe556aeaa","Type":"ContainerDied","Data":"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858"} Apr 16 20:42:03.987799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.987718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn" event={"ID":"bdc70b46-a368-4fa7-ab84-44dfe556aeaa","Type":"ContainerDied","Data":"c9c84bb94412f6f9f71e5e083f18c824bee621ed9ca1f716b7c5d57fbb61d21a"} Apr 16 20:42:03.987799 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.987736 2569 scope.go:117] "RemoveContainer" containerID="ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858" Apr 16 20:42:03.988822 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.988801 2569 generic.go:358] "Generic (PLEG): container finished" podID="a9564e1c-a911-4775-a65e-6d3137def60a" containerID="a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e" exitCode=0 Apr 16 20:42:03.988885 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.988851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" event={"ID":"a9564e1c-a911-4775-a65e-6d3137def60a","Type":"ContainerDied","Data":"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e"} Apr 16 20:42:03.988885 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.988862 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" Apr 16 20:42:03.988885 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.988875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7" event={"ID":"a9564e1c-a911-4775-a65e-6d3137def60a","Type":"ContainerDied","Data":"58ccd03d7558d790be4b85fba680c79d0c4a9b3e4cb12b52fb98314c7f557f5c"} Apr 16 20:42:03.999690 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.999667 2569 scope.go:117] "RemoveContainer" containerID="ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858" Apr 16 20:42:04.000066 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:42:03.999967 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858\": container with ID starting with ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858 not found: ID does not exist" containerID="ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858" Apr 16 20:42:04.000066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:03.999995 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858"} err="failed to get container status \"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858\": rpc error: code = NotFound desc = could not find container \"ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858\": container with ID starting with ffc603ff4a39ae75bed590333bdeacf295b5c78ebdb5afd5fd87c71c19403858 not found: ID does not exist" Apr 16 20:42:04.000066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.000012 2569 scope.go:117] "RemoveContainer" containerID="a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e" Apr 16 20:42:04.008386 ip-10-0-135-182 
kubenswrapper[2569]: I0416 20:42:04.008365 2569 scope.go:117] "RemoveContainer" containerID="a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e" Apr 16 20:42:04.008623 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:42:04.008602 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e\": container with ID starting with a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e not found: ID does not exist" containerID="a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e" Apr 16 20:42:04.008659 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.008635 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e"} err="failed to get container status \"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e\": rpc error: code = NotFound desc = could not find container \"a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e\": container with ID starting with a63a447896268cf818151047c75ac2f0291f7b37b44564213ab8378193d5bd0e not found: ID does not exist" Apr 16 20:42:04.012771 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.012747 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"] Apr 16 20:42:04.018176 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.018153 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc0f2-predictor-56bbbc6bdb-mmsf7"] Apr 16 20:42:04.027976 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.027951 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"] Apr 16 20:42:04.031929 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.031909 2569 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc0f2-predictor-55b9dd659f-bvmfn"] Apr 16 20:42:04.819057 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.819023 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:42:04.949765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.949733 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" path="/var/lib/kubelet/pods/a9564e1c-a911-4775-a65e-6d3137def60a/volumes" Apr 16 20:42:04.949966 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:04.949953 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" path="/var/lib/kubelet/pods/bdc70b46-a368-4fa7-ab84-44dfe556aeaa/volumes" Apr 16 20:42:05.824425 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:05.824388 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:42:10.978268 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:10.978205 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:42:10.978636 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:10.978205 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:42:20.977877 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:20.977831 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:42:20.978383 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:20.977827 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:42:30.978066 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:30.978019 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:42:30.978447 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:30.978019 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:42:33.792660 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.792624 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:42:33.793053 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.792861 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" containerID="cri-o://a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754" gracePeriod=30 
Apr 16 20:42:33.819879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.819841 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:42:33.820278 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820260 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" Apr 16 20:42:33.820365 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820280 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" Apr 16 20:42:33.820365 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820300 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" Apr 16 20:42:33.820365 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820308 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" Apr 16 20:42:33.820513 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820412 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdc70b46-a368-4fa7-ab84-44dfe556aeaa" containerName="kserve-container" Apr 16 20:42:33.820513 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.820428 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9564e1c-a911-4775-a65e-6d3137def60a" containerName="kserve-container" Apr 16 20:42:33.823404 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.823381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:42:33.832899 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.832881 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:42:33.838821 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.838792 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:42:33.922797 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.922764 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:42:33.927011 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.926985 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:42:33.928581 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.928413 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:42:33.928765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.928662 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" containerID="cri-o://9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48" gracePeriod=30 Apr 16 20:42:33.936545 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.936356 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:42:33.941484 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.941463 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:42:33.989740 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:33.989705 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:42:33.994492 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:42:33.994427 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2f511d_226f_4bad_953c_ba86ce555dd5.slice/crio-14d48657d848c99764c5ce38d8a9cb24e671878cabbe20bc0137524a75ec1a19 WatchSource:0}: Error finding container 14d48657d848c99764c5ce38d8a9cb24e671878cabbe20bc0137524a75ec1a19: Status 404 returned error can't find the container with id 14d48657d848c99764c5ce38d8a9cb24e671878cabbe20bc0137524a75ec1a19 Apr 16 20:42:34.089927 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.089727 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:42:34.091985 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.091938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" event={"ID":"3e2f511d-226f-4bad-953c-ba86ce555dd5","Type":"ContainerStarted","Data":"a959fc6328ce29b744caa8fa119bdf16a931420f77c60de516c3ab6045877847"} Apr 16 20:42:34.092084 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.091991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" event={"ID":"3e2f511d-226f-4bad-953c-ba86ce555dd5","Type":"ContainerStarted","Data":"14d48657d848c99764c5ce38d8a9cb24e671878cabbe20bc0137524a75ec1a19"} Apr 16 20:42:34.092315 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.092294 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:42:34.092714 ip-10-0-135-182 kubenswrapper[2569]: W0416 20:42:34.092693 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af49bda_d9f4_4028_a6df_4f383d4d4902.slice/crio-f9a97764e9dd1899386843ed25fd8cc3e544c0d533c49c8bcb5254b5c8d0a84a WatchSource:0}: Error finding container f9a97764e9dd1899386843ed25fd8cc3e544c0d533c49c8bcb5254b5c8d0a84a: Status 404 returned error can't find the container with id f9a97764e9dd1899386843ed25fd8cc3e544c0d533c49c8bcb5254b5c8d0a84a Apr 16 20:42:34.093744 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.093699 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:42:34.107811 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.107772 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podStartSLOduration=1.107759472 podStartE2EDuration="1.107759472s" podCreationTimestamp="2026-04-16 20:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:34.106139252 +0000 UTC m=+1841.767736741" watchObservedRunningTime="2026-04-16 20:42:34.107759472 +0000 UTC m=+1841.769356961" Apr 16 20:42:34.818283 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:34.818208 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: 
connection refused" Apr 16 20:42:35.097322 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.097210 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" event={"ID":"0af49bda-d9f4-4028-a6df-4f383d4d4902","Type":"ContainerStarted","Data":"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612"} Apr 16 20:42:35.097322 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.097276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" event={"ID":"0af49bda-d9f4-4028-a6df-4f383d4d4902","Type":"ContainerStarted","Data":"f9a97764e9dd1899386843ed25fd8cc3e544c0d533c49c8bcb5254b5c8d0a84a"} Apr 16 20:42:35.097530 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.097474 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:42:35.097692 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.097662 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:42:35.098605 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.098575 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:42:35.112914 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.112871 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podStartSLOduration=2.112857101 
podStartE2EDuration="2.112857101s" podCreationTimestamp="2026-04-16 20:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:35.111042701 +0000 UTC m=+1842.772640190" watchObservedRunningTime="2026-04-16 20:42:35.112857101 +0000 UTC m=+1842.774454590" Apr 16 20:42:35.823615 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:35.823568 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 16 20:42:36.104351 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:36.104257 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:42:37.399203 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:37.399179 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:42:37.402421 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:37.402400 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:42:38.110547 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.110510 2569 generic.go:358] "Generic (PLEG): container finished" podID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerID="9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48" exitCode=0 Apr 16 20:42:38.110720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.110554 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" event={"ID":"32ceae94-9582-4d3e-90eb-82fe49daba42","Type":"ContainerDied","Data":"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48"} Apr 16 20:42:38.110720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.110574 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" Apr 16 20:42:38.110720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.110590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q" event={"ID":"32ceae94-9582-4d3e-90eb-82fe49daba42","Type":"ContainerDied","Data":"4a75794b5fdf8ce4176c7bd543ad1ab63b0863f077b70ffd8e863a4de0c3e23e"} Apr 16 20:42:38.110720 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.110612 2569 scope.go:117] "RemoveContainer" containerID="9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48" Apr 16 20:42:38.111709 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.111684 2569 generic.go:358] "Generic (PLEG): container finished" podID="badb1357-93f0-417d-87b5-53cea188b186" containerID="a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754" exitCode=0 Apr 16 20:42:38.111817 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.111755 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" Apr 16 20:42:38.111817 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.111766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" event={"ID":"badb1357-93f0-417d-87b5-53cea188b186","Type":"ContainerDied","Data":"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754"} Apr 16 20:42:38.111817 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.111801 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx" event={"ID":"badb1357-93f0-417d-87b5-53cea188b186","Type":"ContainerDied","Data":"74bd2fe3ffd32f2ce47bfe2cd96abe92b1109f6b31fb780e2b3ccaa345e35a92"} Apr 16 20:42:38.120425 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.120401 2569 scope.go:117] "RemoveContainer" containerID="9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48" Apr 16 20:42:38.120680 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:42:38.120664 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48\": container with ID starting with 9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48 not found: ID does not exist" containerID="9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48" Apr 16 20:42:38.120755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.120687 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48"} err="failed to get container status \"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48\": rpc error: code = NotFound desc = could not find container \"9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48\": container 
with ID starting with 9878c956a983525dc7eb4e0936dee22550714386630d36c82516a4691c882e48 not found: ID does not exist" Apr 16 20:42:38.120755 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.120702 2569 scope.go:117] "RemoveContainer" containerID="a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754" Apr 16 20:42:38.128006 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.127989 2569 scope.go:117] "RemoveContainer" containerID="a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754" Apr 16 20:42:38.128292 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:42:38.128273 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754\": container with ID starting with a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754 not found: ID does not exist" containerID="a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754" Apr 16 20:42:38.128350 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.128302 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754"} err="failed to get container status \"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754\": rpc error: code = NotFound desc = could not find container \"a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754\": container with ID starting with a77558d8baee73c671ea3ce849ea9c534f8475925728d730851e7c42d3368754 not found: ID does not exist" Apr 16 20:42:38.136271 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.134578 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:42:38.140727 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.138684 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-932a3-predictor-5b4b84fcb7-jwg8q"] Apr 16 20:42:38.152757 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.152732 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:42:38.156823 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.156798 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-932a3-predictor-7586c4f994-hg9tx"] Apr 16 20:42:38.949826 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.949790 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" path="/var/lib/kubelet/pods/32ceae94-9582-4d3e-90eb-82fe49daba42/volumes" Apr 16 20:42:38.950330 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:38.950161 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="badb1357-93f0-417d-87b5-53cea188b186" path="/var/lib/kubelet/pods/badb1357-93f0-417d-87b5-53cea188b186/volumes" Apr 16 20:42:40.978317 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:40.978275 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:42:40.978694 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:40.978278 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:42:45.098719 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:45.098668 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:42:46.104613 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:46.104562 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:42:50.979155 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:50.979122 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:42:50.979564 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:50.979177 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:42:55.097897 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:55.097840 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:42:56.105135 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:42:56.105086 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:43:05.098087 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:05.097984 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:43:06.104742 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:06.104697 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:43:15.098668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:15.098616 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 20:43:16.105203 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:16.105159 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 20:43:25.099152 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:25.099115 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:43:26.105879 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:43:26.105846 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:51:58.697578 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:51:58.697542 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:51:58.700036 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:51:58.697777 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container" containerID="cri-o://a959fc6328ce29b744caa8fa119bdf16a931420f77c60de516c3ab6045877847" gracePeriod=30 Apr 16 20:51:58.741752 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:51:58.741710 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:51:58.742003 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:51:58.741960 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container" containerID="cri-o://3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612" gracePeriod=30 Apr 16 20:52:01.995588 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:01.995553 2569 generic.go:358] "Generic (PLEG): container finished" podID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerID="a959fc6328ce29b744caa8fa119bdf16a931420f77c60de516c3ab6045877847" exitCode=0 Apr 16 20:52:01.995940 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:01.995620 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" event={"ID":"3e2f511d-226f-4bad-953c-ba86ce555dd5","Type":"ContainerDied","Data":"a959fc6328ce29b744caa8fa119bdf16a931420f77c60de516c3ab6045877847"} Apr 16 20:52:02.040660 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:02.040637 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:52:02.676364 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:02.676343 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:52:03.000364 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.000326 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" event={"ID":"3e2f511d-226f-4bad-953c-ba86ce555dd5","Type":"ContainerDied","Data":"14d48657d848c99764c5ce38d8a9cb24e671878cabbe20bc0137524a75ec1a19"} Apr 16 20:52:03.000364 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.000347 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp" Apr 16 20:52:03.000364 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.000370 2569 scope.go:117] "RemoveContainer" containerID="a959fc6328ce29b744caa8fa119bdf16a931420f77c60de516c3ab6045877847" Apr 16 20:52:03.001527 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.001509 2569 generic.go:358] "Generic (PLEG): container finished" podID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerID="3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612" exitCode=0 Apr 16 20:52:03.001603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.001567 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" Apr 16 20:52:03.001603 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.001585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" event={"ID":"0af49bda-d9f4-4028-a6df-4f383d4d4902","Type":"ContainerDied","Data":"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612"} Apr 16 20:52:03.001701 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.001612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d" event={"ID":"0af49bda-d9f4-4028-a6df-4f383d4d4902","Type":"ContainerDied","Data":"f9a97764e9dd1899386843ed25fd8cc3e544c0d533c49c8bcb5254b5c8d0a84a"} Apr 16 20:52:03.008668 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.008646 2569 scope.go:117] "RemoveContainer" containerID="3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612" Apr 16 20:52:03.015884 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.015866 2569 scope.go:117] "RemoveContainer" containerID="3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612" Apr 16 20:52:03.016132 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:52:03.016113 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612\": container with ID starting with 3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612 not found: ID does not exist" containerID="3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612" Apr 16 20:52:03.016176 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.016144 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612"} err="failed to get container status 
\"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612\": rpc error: code = NotFound desc = could not find container \"3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612\": container with ID starting with 3055dbbdedf6428f33888c62926bb96b68911d149200655d6a36e4990ac93612 not found: ID does not exist" Apr 16 20:52:03.019365 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.019343 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:52:03.028450 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.028426 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ee4eb-predictor-6cdf96cdcb-q4jmp"] Apr 16 20:52:03.039078 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.039058 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:52:03.045291 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:03.045269 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ee4eb-predictor-6b8659d7f7-l4v4d"] Apr 16 20:52:04.949503 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:04.949468 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" path="/var/lib/kubelet/pods/0af49bda-d9f4-4028-a6df-4f383d4d4902/volumes" Apr 16 20:52:04.949946 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:52:04.949746 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" path="/var/lib/kubelet/pods/3e2f511d-226f-4bad-953c-ba86ce555dd5/volumes" Apr 16 20:59:28.868614 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:28.868580 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:59:28.870968 ip-10-0-135-182 kubenswrapper[2569]: I0416 
20:59:28.868815 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" containerID="cri-o://b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a" gracePeriod=30 Apr 16 20:59:28.933530 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:28.933495 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:59:28.933765 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:28.933727 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" containerID="cri-o://470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad" gracePeriod=30 Apr 16 20:59:30.978571 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:30.978521 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 20:59:30.979025 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:30.978524 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 20:59:32.275441 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.275416 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:59:32.472033 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.471934 2569 generic.go:358] "Generic (PLEG): container finished" podID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerID="470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad" exitCode=0 Apr 16 20:59:32.472033 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.472001 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" Apr 16 20:59:32.472287 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.472028 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" event={"ID":"2dd77c35-c20e-4c23-9193-68c30d8d8f14","Type":"ContainerDied","Data":"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad"} Apr 16 20:59:32.472287 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.472076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m" event={"ID":"2dd77c35-c20e-4c23-9193-68c30d8d8f14","Type":"ContainerDied","Data":"86c2f713476529daa73832249988704854444e59f8e9208821bb9c25eff1b071"} Apr 16 20:59:32.472287 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.472097 2569 scope.go:117] "RemoveContainer" containerID="470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad" Apr 16 20:59:32.480484 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.480465 2569 scope.go:117] "RemoveContainer" containerID="470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad" Apr 16 20:59:32.480747 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:59:32.480725 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad\": container with ID starting with 470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad not found: ID does not exist" containerID="470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad" Apr 16 20:59:32.480831 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.480754 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad"} err="failed to get container status \"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad\": rpc error: code = NotFound desc = could not find container \"470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad\": container with ID starting with 470fce86d0bd14377220bf292b031e4a2d299cba0b12f5d1651f186720b382ad not found: ID does not exist" Apr 16 20:59:32.492279 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.492253 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:59:32.496092 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.496073 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a03d4-predictor-794c46dd55-49r9m"] Apr 16 20:59:32.949960 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:32.949925 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" path="/var/lib/kubelet/pods/2dd77c35-c20e-4c23-9193-68c30d8d8f14/volumes" Apr 16 20:59:35.909919 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:35.909893 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:59:36.485958 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.485923 2569 generic.go:358] "Generic (PLEG): container finished" podID="25c668a5-81e3-47be-87d1-261dfeb91853" containerID="b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a" exitCode=0 Apr 16 20:59:36.486162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.485979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" event={"ID":"25c668a5-81e3-47be-87d1-261dfeb91853","Type":"ContainerDied","Data":"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a"} Apr 16 20:59:36.486162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.485993 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" Apr 16 20:59:36.486162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.486022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx" event={"ID":"25c668a5-81e3-47be-87d1-261dfeb91853","Type":"ContainerDied","Data":"d687d44e3aeb84105ebde5e161b6910c1c22df199b98f747f34b4a3035addf83"} Apr 16 20:59:36.486162 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.486039 2569 scope.go:117] "RemoveContainer" containerID="b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a" Apr 16 20:59:36.493942 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.493924 2569 scope.go:117] "RemoveContainer" containerID="b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a" Apr 16 20:59:36.494201 ip-10-0-135-182 kubenswrapper[2569]: E0416 20:59:36.494179 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a\": container with ID starting with b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a not found: ID does not exist" containerID="b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a" Apr 16 20:59:36.494276 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.494210 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a"} err="failed to get container status \"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a\": rpc error: code = NotFound desc = could not find container \"b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a\": container with ID starting with b9256a168d1a6fe0fed9d7fa92b98431e60e41c2652ae703923a33a46aea5e8a not found: ID does not exist" Apr 16 20:59:36.505938 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.505913 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:59:36.507785 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.507763 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a03d4-predictor-7b7c8c8654-scwmx"] Apr 16 20:59:36.950107 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:36.950029 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" path="/var/lib/kubelet/pods/25c668a5-81e3-47be-87d1-261dfeb91853/volumes" Apr 16 20:59:57.398496 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:57.398464 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qhfz4_64a16ceb-71a8-4203-b254-e617d6a240e4/global-pull-secret-syncer/0.log" Apr 16 20:59:57.568579 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:57.568548 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-pdzfv_72d3b5c6-036b-4c05-9113-913e25110e3c/konnectivity-agent/0.log" Apr 16 20:59:57.671888 ip-10-0-135-182 kubenswrapper[2569]: I0416 20:59:57.671811 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-182.ec2.internal_2a31b11c9cf44c8a66bd679136b463d4/haproxy/0.log" Apr 16 21:00:01.313257 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.313208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/alertmanager/0.log" Apr 16 21:00:01.338371 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.338344 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/config-reloader/0.log" Apr 16 21:00:01.362463 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.362442 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/kube-rbac-proxy-web/0.log" Apr 16 21:00:01.387819 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.387800 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/kube-rbac-proxy/0.log" Apr 16 21:00:01.409956 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.409940 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/kube-rbac-proxy-metric/0.log" Apr 16 21:00:01.434609 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.434591 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/prom-label-proxy/0.log" Apr 16 21:00:01.455559 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.455541 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d15b1732-83c2-484a-aeb4-cbf9de572514/init-config-reloader/0.log" Apr 16 21:00:01.515233 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.515190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nbdcc_dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1/kube-state-metrics/0.log" Apr 16 21:00:01.535905 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.535882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nbdcc_dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1/kube-rbac-proxy-main/0.log" Apr 16 21:00:01.563917 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.563860 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-nbdcc_dfff4d3b-2ab4-4a3e-a829-fc1bae4421d1/kube-rbac-proxy-self/0.log" Apr 16 21:00:01.729547 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.729518 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ftphk_f649df33-f768-4559-8efb-4679bd198e57/node-exporter/0.log" Apr 16 21:00:01.752125 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.752100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ftphk_f649df33-f768-4559-8efb-4679bd198e57/kube-rbac-proxy/0.log" Apr 16 21:00:01.776493 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.776469 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ftphk_f649df33-f768-4559-8efb-4679bd198e57/init-textfile/0.log" Apr 16 21:00:01.874115 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.874044 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5rbs2_61e1abe3-3b21-40a0-9ec5-c4dbf542029e/kube-rbac-proxy-main/0.log" Apr 16 21:00:01.897776 ip-10-0-135-182 kubenswrapper[2569]: 
I0416 21:00:01.897747 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5rbs2_61e1abe3-3b21-40a0-9ec5-c4dbf542029e/kube-rbac-proxy-self/0.log"
Apr 16 21:00:01.920378 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:01.920355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5rbs2_61e1abe3-3b21-40a0-9ec5-c4dbf542029e/openshift-state-metrics/0.log"
Apr 16 21:00:02.265585 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:02.265559 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-659db59bd6-6x6bt_eaa06018-7d0b-4e97-8eab-d295413238eb/telemeter-client/0.log"
Apr 16 21:00:02.292511 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:02.292491 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-659db59bd6-6x6bt_eaa06018-7d0b-4e97-8eab-d295413238eb/reload/0.log"
Apr 16 21:00:02.328020 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:02.327996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-659db59bd6-6x6bt_eaa06018-7d0b-4e97-8eab-d295413238eb/kube-rbac-proxy/0.log"
Apr 16 21:00:04.169361 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:04.169336 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-778bf4cfc7-d9xn5_a861d4fc-8efc-499b-81fb-b1101b20ea23/console/0.log"
Apr 16 21:00:05.196103 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.196075 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zf7nk_c5be8917-e623-43fd-9af9-3ccbaba1d169/dns/0.log"
Apr 16 21:00:05.215142 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.215117 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zf7nk_c5be8917-e623-43fd-9af9-3ccbaba1d169/kube-rbac-proxy/0.log"
Apr 16 21:00:05.295067 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.295041 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zbk8x_ee08a763-0e02-4cc9-a7fc-f2422edc681b/dns-node-resolver/0.log"
Apr 16 21:00:05.682831 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.682802 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-658f999b89-l7k42_43c8d38b-37ab-4156-91da-345b7bf10494/registry/0.log"
Apr 16 21:00:05.724615 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.724581 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"]
Apr 16 21:00:05.724995 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.724978 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.724997 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725010 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725018 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725036 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725044 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725060 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725069 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container"
Apr 16 21:00:05.725081 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725078 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725086 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725109 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725118 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725190 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="25c668a5-81e3-47be-87d1-261dfeb91853" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725203 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0af49bda-d9f4-4028-a6df-4f383d4d4902" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725239 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dd77c35-c20e-4c23-9193-68c30d8d8f14" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725255 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e2f511d-226f-4bad-953c-ba86ce555dd5" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725265 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="32ceae94-9582-4d3e-90eb-82fe49daba42" containerName="kserve-container"
Apr 16 21:00:05.725497 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.725275 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="badb1357-93f0-417d-87b5-53cea188b186" containerName="kserve-container"
Apr 16 21:00:05.729732 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.729711 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.731653 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.731630 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wcf77\"/\"default-dockercfg-rx7m6\""
Apr 16 21:00:05.731754 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.731632 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"kube-root-ca.crt\""
Apr 16 21:00:05.732362 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.732348 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"openshift-service-ca.crt\""
Apr 16 21:00:05.735133 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.735109 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"]
Apr 16 21:00:05.743548 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.743529 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f5hp5_85487890-a028-49b2-b173-0f3bef2f3039/node-ca/0.log"
Apr 16 21:00:05.776903 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.776881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-podres\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.777015 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.776912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-lib-modules\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.777015 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.776954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg28t\" (UniqueName: \"kubernetes.io/projected/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-kube-api-access-sg28t\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.777015 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.776991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-sys\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.777015 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.777006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-proc\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877733 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-podres\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877884 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-lib-modules\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877884 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg28t\" (UniqueName: \"kubernetes.io/projected/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-kube-api-access-sg28t\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877884 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-sys\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877884 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-proc\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.877884 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-podres\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.878077 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-lib-modules\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.878077 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-proc\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.878077 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.877953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-sys\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:05.885331 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:05.885303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg28t\" (UniqueName: \"kubernetes.io/projected/0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69-kube-api-access-sg28t\") pod \"perf-node-gather-daemonset-7792s\" (UID: \"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:06.040159 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.040136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:06.160357 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.160334 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"]
Apr 16 21:00:06.162820 ip-10-0-135-182 kubenswrapper[2569]: W0416 21:00:06.162785 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e2afce2_1a27_4bd4_8a3a_7787aaf0ca69.slice/crio-c1b1bfe5c7c5047e6e422b074c4578cd169b2d0d94dd189715bbf2dc069908fe WatchSource:0}: Error finding container c1b1bfe5c7c5047e6e422b074c4578cd169b2d0d94dd189715bbf2dc069908fe: Status 404 returned error can't find the container with id c1b1bfe5c7c5047e6e422b074c4578cd169b2d0d94dd189715bbf2dc069908fe
Apr 16 21:00:06.164258 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.164243 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:00:06.579013 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.578979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s" event={"ID":"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69","Type":"ContainerStarted","Data":"1eb4ccd2d4557be43c823a9d4dac2063fd76a9229cd88dd8568ee51f604acea2"}
Apr 16 21:00:06.579013 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.579017 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s" event={"ID":"0e2afce2-1a27-4bd4-8a3a-7787aaf0ca69","Type":"ContainerStarted","Data":"c1b1bfe5c7c5047e6e422b074c4578cd169b2d0d94dd189715bbf2dc069908fe"}
Apr 16 21:00:06.579453 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.579050 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:06.594339 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.594297 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s" podStartSLOduration=1.59428333 podStartE2EDuration="1.59428333s" podCreationTimestamp="2026-04-16 21:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:06.59275369 +0000 UTC m=+2894.254351182" watchObservedRunningTime="2026-04-16 21:00:06.59428333 +0000 UTC m=+2894.255880819"
Apr 16 21:00:06.799054 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:06.799023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sxqhm_7b290614-0660-49a1-a6ac-a56ab51a99b4/serve-healthcheck-canary/0.log"
Apr 16 21:00:07.308469 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:07.308440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nzqv4_ad308773-bd17-4488-9a1d-78314d278c1a/kube-rbac-proxy/0.log"
Apr 16 21:00:07.329648 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:07.329624 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nzqv4_ad308773-bd17-4488-9a1d-78314d278c1a/exporter/0.log"
Apr 16 21:00:07.349065 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:07.349038 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nzqv4_ad308773-bd17-4488-9a1d-78314d278c1a/extractor/0.log"
Apr 16 21:00:09.271251 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:09.271201 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-7bv8g_d068de2e-40fd-4c8d-9e8c-c68a8cb12247/manager/0.log"
Apr 16 21:00:09.301524 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:09.301497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2zvw8_f18acdb1-e8a1-48b6-a585-ac48f51dca0d/manager/0.log"
Apr 16 21:00:12.592357 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:12.592332 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-7792s"
Apr 16 21:00:14.566365 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.566337 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:00:14.602128 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.602095 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/egress-router-binary-copy/0.log"
Apr 16 21:00:14.641098 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.641076 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/cni-plugins/0.log"
Apr 16 21:00:14.684085 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.684065 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/bond-cni-plugin/0.log"
Apr 16 21:00:14.727111 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.727054 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/routeoverride-cni/0.log"
Apr 16 21:00:14.775062 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.775041 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/whereabouts-cni-bincopy/0.log"
Apr 16 21:00:14.814191 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:14.814160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zmm9_01eeb0f7-ee5b-44af-ab8f-b3296ae5b886/whereabouts-cni/0.log"
Apr 16 21:00:15.491861 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:15.491839 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k66bc_b37ad374-9da6-4bb9-ab54-e87d0ccf8712/kube-multus/0.log"
Apr 16 21:00:15.525627 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:15.525601 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ck2ww_f2865cec-958e-49f5-9bd1-57d8fbb3fefc/network-metrics-daemon/0.log"
Apr 16 21:00:15.561346 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:15.561326 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ck2ww_f2865cec-958e-49f5-9bd1-57d8fbb3fefc/kube-rbac-proxy/0.log"
Apr 16 21:00:16.671281 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.671251 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/ovn-controller/0.log"
Apr 16 21:00:16.701648 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.701577 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/ovn-acl-logging/0.log"
Apr 16 21:00:16.718817 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.718799 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/kube-rbac-proxy-node/0.log"
Apr 16 21:00:16.742736 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.742714 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:00:16.764537 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.764511 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/northd/0.log"
Apr 16 21:00:16.784861 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.784844 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/nbdb/0.log"
Apr 16 21:00:16.808671 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.808654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/sbdb/0.log"
Apr 16 21:00:16.909417 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:16.909385 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmsm8_0cbc952a-810f-46b7-b791-bccdd61ac1b4/ovnkube-controller/0.log"
Apr 16 21:00:18.251265 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:18.251207 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bg95d_4ec7ab1f-5d4d-4ae6-9ba5-216bc7dd364f/network-check-target-container/0.log"
Apr 16 21:00:19.150113 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:19.150083 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-42vjt_c7d8e3b9-d6d9-447c-91b1-b9d4184f699e/iptables-alerter/0.log"
Apr 16 21:00:19.854725 ip-10-0-135-182 kubenswrapper[2569]: I0416 21:00:19.854699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7sgxp_f73138f6-787e-4ee9-b196-b914563cad39/tuned/0.log"