Apr 16 18:09:59.165547 ip-10-0-143-51 systemd[1]: Starting Kubernetes Kubelet... Apr 16 18:09:59.645557 ip-10-0-143-51 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 18:09:59.645557 ip-10-0-143-51 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 18:09:59.645557 ip-10-0-143-51 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 18:09:59.645557 ip-10-0-143-51 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 18:09:59.645557 ip-10-0-143-51 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 18:09:59.647132 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.646984 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 18:09:59.650387 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650366 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.650387 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650383 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.650387 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650389 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.650387 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650393 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650398 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650403 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650407 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650411 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650416 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650421 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650424 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650428 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.650634 ip-10-0-143-51 
kubenswrapper[2576]: W0416 18:09:59.650432 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650436 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650442 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650446 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650450 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650454 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650458 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650462 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650466 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650470 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650474 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.650634 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650478 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650482 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650486 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650490 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650494 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650498 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650502 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650506 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650510 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650514 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650518 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650522 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup 
Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650527 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650531 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650541 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650546 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650551 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650555 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650560 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650564 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.651471 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650569 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650574 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650578 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650583 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650597 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650601 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650606 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650612 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650617 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650621 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650626 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650630 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650634 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650638 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650642 2576 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650647 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650651 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650656 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650661 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:59.652186 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650665 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650673 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650679 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650685 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650689 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650694 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650699 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650704 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650709 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650714 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650719 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650723 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650727 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650735 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650742 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650746 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650751 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650755 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650761 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:59.652735 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650766 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650771 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650775 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650779 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.650784 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651449 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651458 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651464 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651468 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651472 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651477 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651481 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651485 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651490 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651494 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651499 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651503 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651508 2576 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651512 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651516 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.653200 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651522 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651527 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651532 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651536 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651540 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651545 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651549 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651553 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651558 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651562 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651566 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651572 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651576 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651580 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651584 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651589 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651594 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651598 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651603 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651610 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:59.653707 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651617 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651622 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651627 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651632 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651636 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651641 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651646 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651650 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651654 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651658 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651663 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651667 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651672 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651677 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651681 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651687 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651693 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651697 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651702 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.654279 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651706 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651710 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651714 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651719 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651723 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651728 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651733 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651737 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651742 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651746 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651750 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651754 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651758 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651762 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651767 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651772 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651776 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651780 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651786 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.654747 ip-10-0-143-51 
kubenswrapper[2576]: W0416 18:09:59.651790 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.654747 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651794 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651799 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651803 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651809 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651813 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651817 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651821 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651826 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651830 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651834 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651838 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.651842 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655002 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655027 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655038 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655047 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655055 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655060 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655070 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655079 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655085 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 18:09:59.655462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655091 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655097 2576 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655102 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655108 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655113 2576 flags.go:64] FLAG: --cgroup-root="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655119 2576 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655125 2576 flags.go:64] FLAG: --client-ca-file="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655130 2576 flags.go:64] FLAG: --cloud-config="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655136 2576 flags.go:64] FLAG: --cloud-provider="external" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655141 2576 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655150 2576 flags.go:64] FLAG: --cluster-domain="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655156 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655162 2576 flags.go:64] FLAG: --config-dir="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655168 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655174 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655182 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655188 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655193 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655199 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655205 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655211 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655216 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655222 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655227 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655234 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 18:09:59.656369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655254 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655260 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655265 2576 flags.go:64] FLAG: 
--enable-load-reader="false" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655271 2576 flags.go:64] FLAG: --enable-server="true" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655276 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655285 2576 flags.go:64] FLAG: --event-burst="100" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655291 2576 flags.go:64] FLAG: --event-qps="50" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655296 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655301 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655306 2576 flags.go:64] FLAG: --eviction-hard="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655314 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655320 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655325 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655331 2576 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655337 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655343 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655348 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655354 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655359 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655365 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655370 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655377 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655383 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655389 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655395 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655400 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:09:59.657075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655406 2576 flags.go:64] FLAG: --help="false" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655411 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655417 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 
16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655422 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655427 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655433 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655440 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655445 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655450 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655455 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655460 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655467 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655478 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655483 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655489 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655494 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655500 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655506 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655511 2576 flags.go:64] FLAG: --lock-file="" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655517 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655522 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655528 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655537 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:09:59.657813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655543 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655549 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655555 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655561 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655567 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 
18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655572 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655577 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655593 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655599 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655607 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655612 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655617 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655623 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655629 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655634 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655639 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655645 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655658 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655664 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655670 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655675 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655680 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655692 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655697 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:09:59.658677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655703 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655708 2576 flags.go:64] FLAG: --port="10250" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655713 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655718 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e555f4514b4e214d" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655724 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655730 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:09:59.655735 2576 flags.go:64] FLAG: --register-node="true" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655740 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655745 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655753 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655759 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655764 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655769 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655776 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655782 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655787 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655793 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655798 2576 flags.go:64] FLAG: --runonce="false" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655804 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655810 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655815 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655820 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655825 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655831 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655837 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655843 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:09:59.659296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655848 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655853 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655858 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655864 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655869 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655876 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655881 2576 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655891 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655896 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655901 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655909 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655914 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655920 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655925 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655930 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655936 2576 flags.go:64] FLAG: --v="2" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655942 2576 flags.go:64] FLAG: --version="false" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655950 2576 flags.go:64] FLAG: --vmodule="" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655957 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.655963 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656121 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656128 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656134 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656139 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.660022 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656143 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656149 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656154 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656159 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656164 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656172 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656176 2576 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656181 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656186 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656190 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656195 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656201 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656206 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656211 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656217 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656221 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656225 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656230 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656236 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656258 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.660630 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656264 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656268 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656273 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656278 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656282 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656287 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656292 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656297 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656301 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:59.661147 ip-10-0-143-51 
kubenswrapper[2576]: W0416 18:09:59.656306 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656310 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656315 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656319 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656323 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656328 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656332 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656336 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656343 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656348 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656352 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.661147 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656357 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656361 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656366 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656371 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656375 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656380 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656384 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656390 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656395 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656399 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656404 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656409 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 
18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656413 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656418 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656422 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656427 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656431 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656436 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656440 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656445 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.661669 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656449 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656454 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656459 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656463 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656468 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656475 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656480 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656485 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656489 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656496 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656501 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656505 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656512 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656519 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656524 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656529 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656533 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656538 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656544 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:59.662217 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656548 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.662755 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656554 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.662755 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.656559 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.662755 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.657263 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:59.665748 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.665727 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:09:59.665748 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.665747 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665797 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665803 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665807 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665810 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665813 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665816 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665819 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665821 2576 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665824 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665827 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665830 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665833 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665836 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665839 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665842 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665844 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665847 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665849 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665852 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:59.665853 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665855 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665860 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665863 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665866 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665869 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665872 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665875 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665877 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665880 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665883 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665885 2576 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665888 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665891 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665894 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665896 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665899 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665901 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665904 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665906 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:59.666389 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665909 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665911 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665913 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665917 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665921 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665924 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665926 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665929 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665931 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665934 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665936 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665939 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665942 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665944 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665948 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665952 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665957 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665960 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665963 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.666888 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665966 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665969 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665971 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665974 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665977 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665980 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665982 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665985 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665987 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665990 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665992 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665995 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.665998 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666000 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666003 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666005 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666008 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666010 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666013 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 
18:09:59.666016 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.667370 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666019 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666021 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666024 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666026 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666029 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666032 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666035 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666038 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.666043 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666163 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666168 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666171 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666174 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666177 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666179 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.667869 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666182 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666184 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666187 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666189 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 
18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666192 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666195 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666197 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666200 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666203 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666206 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666214 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666218 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666222 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666255 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666261 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666265 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666271 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666274 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666277 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:09:59.668336 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666280 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666283 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666286 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666289 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666292 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666295 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666299 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666302 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666305 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666307 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666310 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666312 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666315 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666317 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666320 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666322 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666325 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666328 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666330 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666333 2576 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.668827 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666335 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666338 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666340 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666343 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666345 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666348 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666350 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666353 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666355 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666358 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666360 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666363 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666366 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666368 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666371 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666373 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666376 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666379 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666381 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666384 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.669346 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666386 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666389 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 
18:09:59.666392 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666395 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666398 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666400 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666403 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666406 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666408 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666411 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666413 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666416 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666418 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666421 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666424 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666426 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666429 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666431 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666434 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666436 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.669835 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:09:59.666439 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.670359 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.666443 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 
18:09:59.670359 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.667226 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:09:59.670359 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.669537 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:09:59.670533 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.670521 2576 server.go:1019] "Starting client certificate rotation" Apr 16 18:09:59.670638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.670621 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:59.670671 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.670659 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:09:59.695961 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.695940 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:59.697945 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.697912 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:09:59.713153 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.713132 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:09:59.718498 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.718483 2576 log.go:25] "Validated CRI v1 image API" Apr 16 18:09:59.719757 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.719731 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:09:59.724095 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.724064 2576 fs.go:135] Filesystem UUIDs: map[23b374de-9cd3-420b-9738-56e2fe50c248:/dev/nvme0n1p4 511075e6-34ce-4f18-a5dd-4e1d16cec459:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 16 18:09:59.724172 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.724093 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:09:59.726954 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.726936 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:09:59.730405 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.730298 2576 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:59.728362668 +0000 UTC m=+0.431694182 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101101 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d8eb0374446e6a44177cc23074e84 SystemUUID:ec2d8eb0-3744-46e6-a441-77cc23074e84 BootID:629ea8f5-faf2-4dc3-b5ef-9752e422149c Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:43:fc:65:77:65 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:43:fc:65:77:65 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:2d:10:59:73:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:09:59.730405 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.730398 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 18:09:59.730516 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.730481 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:09:59.731414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.731391 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:09:59.731549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.731416 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-51.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:09:59.731591 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.731558 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:09:59.731591 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.731567 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:09:59.731591 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.731580 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:59.732382 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.732371 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:59.733834 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.733824 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:59.733934 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.733925 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:09:59.736423 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.736413 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:09:59.736464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.736432 2576 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 16 18:09:59.736464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.736444 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:09:59.736464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.736454 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:09:59.736464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.736462 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:09:59.737500 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.737489 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:59.737544 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.737507 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:59.740441 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.740426 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:09:59.742200 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.742187 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:09:59.743659 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743648 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743664 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743670 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743676 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743681 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743687 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743692 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:09:59.743698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743697 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:09:59.743883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743704 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:09:59.743883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743717 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:09:59.743883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743729 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:09:59.743883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.743738 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:09:59.744616 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.744604 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:09:59.744616 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.744615 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 18:09:59.748076 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.748063 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:09:59.748157 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.748106 2576 server.go:1295] "Started kubelet" Apr 16 18:09:59.748281 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.748212 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:09:59.748342 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.748288 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:09:59.748342 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.748340 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:09:59.749497 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.749478 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:09:59.749582 ip-10-0-143-51 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:09:59.750169 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.750098 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-51.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:09:59.750279 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.750240 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcwzq" Apr 16 18:09:59.751092 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.751076 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:09:59.751256 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.751118 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-51.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:09:59.751256 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.751132 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:09:59.755050 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.755034 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:09:59.755136 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.755038 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:59.757904 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.754406 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-51.ec2.internal.18a6e8bc98219c82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-51.ec2.internal,UID:ip-10-0-143-51.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-51.ec2.internal,},FirstTimestamp:2026-04-16 18:09:59.748074626 +0000 UTC m=+0.451406141,LastTimestamp:2026-04-16 18:09:59.748074626 +0000 UTC m=+0.451406141,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-51.ec2.internal,}" Apr 16 18:09:59.758044 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.758028 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcwzq" Apr 16 18:09:59.758538 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.758517 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:09:59.759477 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.759462 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:09:59.759592 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.759573 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:09:59.759763 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.759751 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:09:59.759763 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.759762 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:09:59.760194 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760177 2576 factory.go:153] Registering CRI-O factory Apr 16 18:09:59.760297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760199 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 18:09:59.760297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760278 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:09:59.760297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760288 2576 factory.go:55] Registering systemd factory Apr 16 18:09:59.760297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760296 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:09:59.760477 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760319 2576 factory.go:103] Registering Raw factory Apr 16 18:09:59.760477 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.760331 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 18:09:59.760920 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.760893 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:09:59.761727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.761697 2576 manager.go:319] Starting recovery of all containers Apr 16 18:09:59.763281 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.763259 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:09:59.766496 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.766476 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:59.768675 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.768650 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-51.ec2.internal\" not found" node="ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.772523 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.772407 2576 manager.go:324] Recovery completed Apr 16 18:09:59.776439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.776430 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.780591 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.780574 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:59.780677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.780609 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.780677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.780624 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.781126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.781113 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:09:59.781164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.781127 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:09:59.781164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.781145 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:59.783762 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.783747 2576 policy_none.go:49] "None policy: Start" Apr 16 18:09:59.783762 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.783765 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:09:59.783862 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.783775 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:09:59.819931 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.819913 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.819948 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.819958 2576 server.go:85] "Starting device plugin registration server" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.820216 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.820231 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.820369 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.820445 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 
18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.820453 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.821096 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:09:59.834366 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.821138 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:09:59.892712 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.892676 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:09:59.893891 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.893867 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:09:59.893891 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.893896 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:09:59.894057 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.893916 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:09:59.894057 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.893922 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:09:59.894057 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.893964 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:09:59.896553 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.896507 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:59.922213 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.922188 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.923082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.923067 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:59.923155 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.923096 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.923155 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.923107 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.923155 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.923128 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.931014 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.930997 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.931063 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.931020 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-51.ec2.internal\": node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:09:59.958102 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:09:59.958082 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:09:59.994338 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:09:59.994309 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal"] Apr 16 18:09:59.994405 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.994398 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.995285 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.995271 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:59.995340 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.995300 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.995340 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.995309 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.996563 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.996550 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.996678 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.996663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.996712 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.996697 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.997271 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997257 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:59.997271 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997267 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:09:59.997381 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997287 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.997381 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997288 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.997381 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997297 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.997381 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.997302 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.999570 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.999557 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:09:59.999632 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:09:59.999579 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.000585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.000568 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.000647 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.000601 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.000647 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.000617 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.025134 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.025105 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-51.ec2.internal\" not found" node="ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.029587 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.029571 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-51.ec2.internal\" not found" node="ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.058705 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.058666 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.060902 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.060879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a7b007f7f10e6a8374521e6218a633f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-51.ec2.internal\" (UID: \"9a7b007f7f10e6a8374521e6218a633f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.060987 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.060918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.060987 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.060947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.159625 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.159563 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.161805 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.161852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.161852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a7b007f7f10e6a8374521e6218a633f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-51.ec2.internal\" (UID: \"9a7b007f7f10e6a8374521e6218a633f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.161927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9a7b007f7f10e6a8374521e6218a633f-config\") pod \"kube-apiserver-proxy-ip-10-0-143-51.ec2.internal\" (UID: \"9a7b007f7f10e6a8374521e6218a633f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.161927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.161987 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.161901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9de07c5d9aedf251a72fc754467481ac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal\" (UID: \"9de07c5d9aedf251a72fc754467481ac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.259676 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.259624 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.328997 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.328970 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.332672 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.332655 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.359915 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.359885 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.460466 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.460369 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.560889 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.560845 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.644540 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.644511 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:00.661347 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.661309 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.670616 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.670595 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:10:00.670739 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.670724 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:10:00.670781 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.670763 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:10:00.670781 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.670764 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:10:00.756007 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.755934 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:10:00.761754 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:00.761731 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-51.ec2.internal\" not found" Apr 16 18:10:00.762954 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.762930 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:59 +0000 UTC" deadline="2027-11-02 21:17:11.037077208 +0000 UTC" Apr 16 18:10:00.763019 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.762954 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13563h7m10.274126337s" Apr 16 18:10:00.765552 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.765534 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:10:00.773603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.773582 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:00.804779 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.804758 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mmvq" Apr 16 18:10:00.812031 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.812015 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mmvq" Apr 16 18:10:00.858178 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.858156 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.869175 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.869157 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:10:00.870232 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.870218 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" Apr 16 18:10:00.879655 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.879635 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:10:00.993523 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:00.993485 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7b007f7f10e6a8374521e6218a633f.slice/crio-547a03d3263377ae9357ef5e4496bc1271f365c3eaa980bf5fbdfa3eef97292b WatchSource:0}: Error finding container 547a03d3263377ae9357ef5e4496bc1271f365c3eaa980bf5fbdfa3eef97292b: Status 404 returned error can't find the container with id 547a03d3263377ae9357ef5e4496bc1271f365c3eaa980bf5fbdfa3eef97292b Apr 16 18:10:00.993941 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:00.993919 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de07c5d9aedf251a72fc754467481ac.slice/crio-c43de99bdc0493602180060ef4f7be34ee1f765691bb0038935e7cf97392c787 WatchSource:0}: Error finding container c43de99bdc0493602180060ef4f7be34ee1f765691bb0038935e7cf97392c787: Status 404 returned error can't find the container with id c43de99bdc0493602180060ef4f7be34ee1f765691bb0038935e7cf97392c787 Apr 16 18:10:00.998352 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:00.998331 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:10:01.737455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.737421 2576 apiserver.go:52] "Watching apiserver" Apr 16 18:10:01.745055 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.745026 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:10:01.745459 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.745432 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-qhbm6","openshift-ovn-kubernetes/ovnkube-node-zkbc8","kube-system/konnectivity-agent-lsx75","kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal","openshift-multus/multus-additional-cni-plugins-fkzwr","openshift-multus/network-metrics-daemon-687m2","openshift-network-operator/iptables-alerter-snh2z","kube-system/global-pull-secret-syncer-slcjb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz","openshift-cluster-node-tuning-operator/tuned-xrwkr","openshift-dns/node-resolver-tqcl9","openshift-image-registry/node-ca-xcxgk","openshift-multus/multus-d24gh"] Apr 16 18:10:01.748082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.748054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.748377 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.748304 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.749369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.749347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.750439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.750439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750223 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rqkcd\"" Apr 16 18:10:01.750439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750187 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:10:01.750439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750339 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.750439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750347 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.750711 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.750711 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:10:01.750955 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.750935 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2b5w5\"" Apr 16 18:10:01.751364 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.751346 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dtdqr\"" Apr 16 18:10:01.751975 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.751475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:10:01.751975 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:10:01.751638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:10:01.754505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.754178 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.755689 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.755670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.756031 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.755794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.756959 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.756934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.757129 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.757048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:01.757233 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.757184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.757437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.757422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j5k87\"" Apr 16 18:10:01.757514 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.757466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.758457 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.758351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:01.758457 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.758424 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:01.759665 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.759646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.759746 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.759723 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:01.761866 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.761846 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.762135 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.762436 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762360 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:10:01.762595 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762575 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.762697 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.762697 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g6dg6\"" Apr 16 18:10:01.762831 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vwqxk\"" Apr 16 18:10:01.762923 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762842 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:10:01.762984 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.762963 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.763083 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.763062 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:10:01.765397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.764560 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.765397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.764893 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.765397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.765277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-462fd\"" Apr 16 18:10:01.765397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.765294 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:10:01.765603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.765560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.765888 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.765867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.766647 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.766627 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:10:01.766865 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.766813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fs5s4\"" Apr 16 18:10:01.767201 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767157 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:10:01.767292 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767210 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:10:01.767380 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:10:01.767524 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:10:01.767634 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767516 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:10:01.767945 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x4h8x\"" Apr 16 18:10:01.767945 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.767838 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:10:01.768955 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.768938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-conf\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 
16 18:10:01.769060 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.768960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-systemd\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769060 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.768982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-tmp\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769060 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc78l\" (UniqueName: \"kubernetes.io/projected/abb881c2-5bd6-4f73-a490-2665c9449ae7-kube-api-access-wc78l\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.769060 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.769060 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-agent-certs\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzjxs\" (UniqueName: \"kubernetes.io/projected/62ab651d-790e-48bc-91c8-e40eada59965-kube-api-access-zzjxs\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-kubelet-config\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cnibin\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769136 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-var-lib-kubelet\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-host\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-dbus\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/969cd886-bdfc-46ca-ab57-08cca0abed0b-serviceca\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-system-cni-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.769330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysconfig\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769792 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-run\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.769847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32938f07-abc4-4f36-9ffc-6472e5b05222-iptables-alerter-script\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.769847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769838 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32938f07-abc4-4f36-9ffc-6472e5b05222-host-slash\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.769950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-os-release\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.769950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/8670fc01-fa98-49d3-8a70-5fe409cb46a1-kube-api-access-j75ph\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.769950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-konnectivity-ca\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.769950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.770105 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.769989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.770105 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62ab651d-790e-48bc-91c8-e40eada59965-tmp-dir\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.770105 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-lib-modules\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.770222 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-tuned\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.770222 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmzps\" (UniqueName: \"kubernetes.io/projected/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-kube-api-access-zmzps\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.770222 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b285f\" (UniqueName: \"kubernetes.io/projected/32938f07-abc4-4f36-9ffc-6472e5b05222-kube-api-access-b285f\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.770385 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/969cd886-bdfc-46ca-ab57-08cca0abed0b-host\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.770385 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.770385 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" 
Apr 16 18:10:01.770385 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-kubernetes\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.770528 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-sys\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.770528 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4p8\" (UniqueName: \"kubernetes.io/projected/969cd886-bdfc-46ca-ab57-08cca0abed0b-kube-api-access-9x4p8\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.770528 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62ab651d-790e-48bc-91c8-e40eada59965-hosts-file\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.770528 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.770468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-modprobe-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.813942 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.813896 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:00 +0000 UTC" deadline="2027-12-20 19:38:53.71832692 +0000 UTC" Apr 16 18:10:01.813942 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.813928 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14713h28m51.904401921s" Apr 16 18:10:01.859518 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.859488 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:10:01.871196 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62ab651d-790e-48bc-91c8-e40eada59965-hosts-file\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.871349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-systemd\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 
18:10:01.871349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-etc-selinux\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.871349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-netd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.871349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-systemd\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.871349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62ab651d-790e-48bc-91c8-e40eada59965-hosts-file\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovn-node-metrics-cert\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-k8s-cni-cncf-io\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzjxs\" (UniqueName: \"kubernetes.io/projected/62ab651d-790e-48bc-91c8-e40eada59965-kube-api-access-zzjxs\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-script-lib\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-cni-binary-copy\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.871596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-kubelet-config\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-kubelet-config\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-systemd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-var-lib-kubelet\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-host\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-os-release\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-dbus\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.871821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871812 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/969cd886-bdfc-46ca-ab57-08cca0abed0b-serviceca\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-var-lib-kubelet\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-config\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-env-overrides\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-cni-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-netns\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-conf-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.871995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b097be47-4dfd-4fcd-a8ab-78a2cd491538-dbus\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " 
pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32938f07-abc4-4f36-9ffc-6472e5b05222-host-slash\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-os-release\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-konnectivity-ca\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-host\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-lib-modules\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32938f07-abc4-4f36-9ffc-6472e5b05222-host-slash\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.872186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-tuned\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmzps\" (UniqueName: \"kubernetes.io/projected/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-kube-api-access-zmzps\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-os-release\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-lib-modules\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-kubelet\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-bin\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-daemon-config\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-etc-kubernetes\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/969cd886-bdfc-46ca-ab57-08cca0abed0b-host\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872506 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-kubernetes\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-sys\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872562 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-multus\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-kubernetes\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.872850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/969cd886-bdfc-46ca-ab57-08cca0abed0b-serviceca\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-sys\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/969cd886-bdfc-46ca-ab57-08cca0abed0b-host\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4p8\" (UniqueName: 
\"kubernetes.io/projected/969cd886-bdfc-46ca-ab57-08cca0abed0b-kube-api-access-9x4p8\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-modprobe-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-konnectivity-ca\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-conf\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-tmp\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc78l\" (UniqueName: \"kubernetes.io/projected/abb881c2-5bd6-4f73-a490-2665c9449ae7-kube-api-access-wc78l\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-conf\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-registration-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-var-lib-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: 
I0416 18:10:01.872875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-etc-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-modprobe-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-agent-certs\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-sys-fs\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.873623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.872991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-systemd-units\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpq6\" (UniqueName: \"kubernetes.io/projected/ced49c9c-a339-4f7e-9970-0aabc7b2d765-kube-api-access-vkpq6\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-cnibin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-kubelet\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.873082 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cnibin\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cnibin\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.873171 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:02.37312317 +0000 UTC m=+3.076454686 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-socket-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-device-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-netns\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-bin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:01.874424 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-slash\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysctl-d\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-system-cni-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-socket-dir-parent\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-hostroot\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-multus-certs\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-system-cni-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysconfig\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873621 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-run\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9mr\" (UniqueName: \"kubernetes.io/projected/ea8057f2-eb32-4731-8eae-88959bceb760-kube-api-access-qq9mr\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-system-cni-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-sysconfig\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-ovn\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-run\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82fk5\" (UniqueName: \"kubernetes.io/projected/07c55bf2-6978-4a9f-ace9-a376fd86df33-kube-api-access-82fk5\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32938f07-abc4-4f36-9ffc-6472e5b05222-iptables-alerter-script\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.875209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/8670fc01-fa98-49d3-8a70-5fe409cb46a1-kube-api-access-j75ph\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: 
\"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-log-socket\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62ab651d-790e-48bc-91c8-e40eada59965-tmp-dir\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.873927 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.873979 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:02.373962318 +0000 UTC m=+3.077293835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.873931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b285f\" (UniqueName: \"kubernetes.io/projected/32938f07-abc4-4f36-9ffc-6472e5b05222-kube-api-access-b285f\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-node-log\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32938f07-abc4-4f36-9ffc-6472e5b05222-iptables-alerter-script\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62ab651d-790e-48bc-91c8-e40eada59965-tmp-dir\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8670fc01-fa98-49d3-8a70-5fe409cb46a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.876030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.874881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8670fc01-fa98-49d3-8a70-5fe409cb46a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.876692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.876497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-tmp\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.876692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.876550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-etc-tuned\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.876853 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.876830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7a7a48be-c9f2-4cfa-8741-c0bfba5190ba-agent-certs\") pod \"konnectivity-agent-lsx75\" (UID: \"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba\") " pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:01.879062 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.879038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzjxs\" (UniqueName: \"kubernetes.io/projected/62ab651d-790e-48bc-91c8-e40eada59965-kube-api-access-zzjxs\") pod \"node-resolver-tqcl9\" (UID: \"62ab651d-790e-48bc-91c8-e40eada59965\") " pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:01.879795 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.879750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmzps\" (UniqueName: \"kubernetes.io/projected/fd0b27ea-1963-4d05-88c4-3dced0ebdf3a-kube-api-access-zmzps\") pod \"tuned-xrwkr\" (UID: \"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a\") " pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:01.880697 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.880596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4p8\" (UniqueName: \"kubernetes.io/projected/969cd886-bdfc-46ca-ab57-08cca0abed0b-kube-api-access-9x4p8\") pod \"node-ca-xcxgk\" (UID: \"969cd886-bdfc-46ca-ab57-08cca0abed0b\") " pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:01.886463 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.886436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc78l\" (UniqueName: \"kubernetes.io/projected/abb881c2-5bd6-4f73-a490-2665c9449ae7-kube-api-access-wc78l\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:01.887428 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.887380 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:01.887428 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.887402 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:01.887428 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.887417 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:01.887601 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:01.887484 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:02.387466568 +0000 UTC m=+3.090798087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:01.892740 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.892716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75ph\" (UniqueName: \"kubernetes.io/projected/8670fc01-fa98-49d3-8a70-5fe409cb46a1-kube-api-access-j75ph\") pod \"multus-additional-cni-plugins-fkzwr\" (UID: \"8670fc01-fa98-49d3-8a70-5fe409cb46a1\") " pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:01.892942 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.892921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b285f\" (UniqueName: \"kubernetes.io/projected/32938f07-abc4-4f36-9ffc-6472e5b05222-kube-api-access-b285f\") pod \"iptables-alerter-snh2z\" (UID: \"32938f07-abc4-4f36-9ffc-6472e5b05222\") " pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:01.898837 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.898789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" event={"ID":"9a7b007f7f10e6a8374521e6218a633f","Type":"ContainerStarted","Data":"547a03d3263377ae9357ef5e4496bc1271f365c3eaa980bf5fbdfa3eef97292b"} Apr 16 18:10:01.899934 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.899912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" event={"ID":"9de07c5d9aedf251a72fc754467481ac","Type":"ContainerStarted","Data":"c43de99bdc0493602180060ef4f7be34ee1f765691bb0038935e7cf97392c787"} Apr 16 18:10:01.975015 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.974972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-systemd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-os-release\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:10:01.975083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-config\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-systemd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-env-overrides\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-cni-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-netns\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-conf-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-os-release\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-conf-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-cni-dir\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-netns\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-kubelet\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-bin\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-daemon-config\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-etc-kubernetes\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-bin\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975506 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-multus\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-registration-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.975590 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-var-lib-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-etc-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-env-overrides\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-sys-fs\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-systemd-units\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-etc-kubernetes\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpq6\" (UniqueName: \"kubernetes.io/projected/ced49c9c-a339-4f7e-9970-0aabc7b2d765-kube-api-access-vkpq6\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-etc-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-cnibin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-kubelet\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-registration-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-config\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-socket-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-device-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-multus\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-netns\") pod 
\"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-netns\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.976389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-bin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-cnibin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-slash\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-kubelet\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-var-lib-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.975999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-system-cni-dir\") 
pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-daemon-config\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-socket-dir-parent\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-kubelet\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-var-lib-cni-bin\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-sys-fs\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-hostroot\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-multus-certs\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-system-cni-dir\") pod \"multus-d24gh\" (UID: 
\"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-device-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.977139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-slash\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9mr\" (UniqueName: \"kubernetes.io/projected/ea8057f2-eb32-4731-8eae-88959bceb760-kube-api-access-qq9mr\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-multus-socket-dir-parent\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-openvswitch\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-systemd-units\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-hostroot\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-socket-dir\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-multus-certs\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-ovn\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82fk5\" (UniqueName: \"kubernetes.io/projected/07c55bf2-6978-4a9f-ace9-a376fd86df33-kube-api-access-82fk5\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-run-ovn\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-log-socket\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-log-socket\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-node-log\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-etc-selinux\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-netd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovn-node-metrics-cert\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-node-log\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-k8s-cni-cncf-io\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/07c55bf2-6978-4a9f-ace9-a376fd86df33-host-run-k8s-cni-cncf-io\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-script-lib\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-cni-binary-copy\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea8057f2-eb32-4731-8eae-88959bceb760-etc-selinux\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.976819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ced49c9c-a339-4f7e-9970-0aabc7b2d765-host-cni-netd\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.977206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07c55bf2-6978-4a9f-ace9-a376fd86df33-cni-binary-copy\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.978840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.977296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovnkube-script-lib\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.979609 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.979558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ced49c9c-a339-4f7e-9970-0aabc7b2d765-ovn-node-metrics-cert\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.985284 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.985221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82fk5\" (UniqueName: \"kubernetes.io/projected/07c55bf2-6978-4a9f-ace9-a376fd86df33-kube-api-access-82fk5\") pod \"multus-d24gh\" (UID: \"07c55bf2-6978-4a9f-ace9-a376fd86df33\") " pod="openshift-multus/multus-d24gh" Apr 16 18:10:01.985566 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.985545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpq6\" (UniqueName: \"kubernetes.io/projected/ced49c9c-a339-4f7e-9970-0aabc7b2d765-kube-api-access-vkpq6\") pod \"ovnkube-node-zkbc8\" (UID: \"ced49c9c-a339-4f7e-9970-0aabc7b2d765\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:01.987378 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:01.987354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9mr\" (UniqueName: \"kubernetes.io/projected/ea8057f2-eb32-4731-8eae-88959bceb760-kube-api-access-qq9mr\") pod \"aws-ebs-csi-driver-node-5mjbz\" (UID: \"ea8057f2-eb32-4731-8eae-88959bceb760\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:02.062712 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.062622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-snh2z" Apr 16 18:10:02.069521 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.069495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xcxgk" Apr 16 18:10:02.077847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.077553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:02.079297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.079278 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:02.082984 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.082965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tqcl9" Apr 16 18:10:02.085089 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.085069 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:02.090805 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.090785 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" Apr 16 18:10:02.098449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.098422 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" Apr 16 18:10:02.105049 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.105023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" Apr 16 18:10:02.113828 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.113805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:02.120602 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.120578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d24gh" Apr 16 18:10:02.379802 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.379768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:02.379981 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.379837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:02.379981 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.379927 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:02.379981 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.379943 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.380109 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.380006 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:03.379986824 +0000 UTC m=+4.083318327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:02.380109 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.380028 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:03.380019036 +0000 UTC m=+4.083350538 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.480712 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.480677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:02.480942 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.480866 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:02.480942 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.480894 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:02.480942 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.480907 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.481072 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:02.480978 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:03.480957819 +0000 UTC m=+4.184289334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.747541 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.747450 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:02.814743 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.814697 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:00 +0000 UTC" deadline="2028-01-07 04:29:51.156615892 +0000 UTC" Apr 16 18:10:02.814743 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.814735 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15130h19m48.341884194s" Apr 16 18:10:02.864645 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.864615 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced49c9c_a339_4f7e_9970_0aabc7b2d765.slice/crio-b5a7702326fbfda2e92a45ae6ebe09097c783190bf9a0409827da54f29313406 WatchSource:0}: Error finding container b5a7702326fbfda2e92a45ae6ebe09097c783190bf9a0409827da54f29313406: Status 404 returned error can't find the container with id b5a7702326fbfda2e92a45ae6ebe09097c783190bf9a0409827da54f29313406 Apr 16 18:10:02.865207 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.865180 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969cd886_bdfc_46ca_ab57_08cca0abed0b.slice/crio-3e39381ab8583963bc10193717afceaca617a3715f13ed8466dd5d52de918283 WatchSource:0}: Error finding container 3e39381ab8583963bc10193717afceaca617a3715f13ed8466dd5d52de918283: Status 404 returned error can't find the container with id 3e39381ab8583963bc10193717afceaca617a3715f13ed8466dd5d52de918283 Apr 16 18:10:02.868070 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.867937 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7a48be_c9f2_4cfa_8741_c0bfba5190ba.slice/crio-5bfee6824eab4ef470867dfbc7854b9fcaa4b664ed32d562d6da1d1e27207d2d WatchSource:0}: Error finding container 5bfee6824eab4ef470867dfbc7854b9fcaa4b664ed32d562d6da1d1e27207d2d: Status 404 returned error can't find the container with id 5bfee6824eab4ef470867dfbc7854b9fcaa4b664ed32d562d6da1d1e27207d2d Apr 16 18:10:02.870717 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.870682 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8057f2_eb32_4731_8eae_88959bceb760.slice/crio-82e15fffe894ca113243a310321fb77ea97be98041ebc3587cfa56da97af10bd WatchSource:0}: Error finding container 82e15fffe894ca113243a310321fb77ea97be98041ebc3587cfa56da97af10bd: Status 404 returned error can't find the container with id 82e15fffe894ca113243a310321fb77ea97be98041ebc3587cfa56da97af10bd Apr 16 18:10:02.871394 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.871276 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8670fc01_fa98_49d3_8a70_5fe409cb46a1.slice/crio-885a2e326acd9dcfe61319d7658f48445019e787eec3934cdab8f2c65dc9d4ad WatchSource:0}: Error finding container 885a2e326acd9dcfe61319d7658f48445019e787eec3934cdab8f2c65dc9d4ad: Status 404 returned error can't find the container with id 885a2e326acd9dcfe61319d7658f48445019e787eec3934cdab8f2c65dc9d4ad Apr 16 18:10:02.873169 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.872459 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd0b27ea_1963_4d05_88c4_3dced0ebdf3a.slice/crio-f5a438799ea694281e0961fa1b50e55c5c6dc180de404937a6a98d693424ac1f WatchSource:0}: Error finding container f5a438799ea694281e0961fa1b50e55c5c6dc180de404937a6a98d693424ac1f: Status 404 returned error can't find the container with id f5a438799ea694281e0961fa1b50e55c5c6dc180de404937a6a98d693424ac1f Apr 16 18:10:02.874388 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.874365 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07c55bf2_6978_4a9f_ace9_a376fd86df33.slice/crio-19a383c18ec2b4ef776ce2d799b440855eb9c52b796d70b8e6b374f520c651e9 WatchSource:0}: Error finding container 19a383c18ec2b4ef776ce2d799b440855eb9c52b796d70b8e6b374f520c651e9: Status 404 returned error can't find the container with id 19a383c18ec2b4ef776ce2d799b440855eb9c52b796d70b8e6b374f520c651e9 Apr 16 18:10:02.877561 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.877534 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32938f07_abc4_4f36_9ffc_6472e5b05222.slice/crio-d0f99f62523d78b3631845da66d029c811314428bb516ebeaf5dc422120247ec WatchSource:0}: Error finding container d0f99f62523d78b3631845da66d029c811314428bb516ebeaf5dc422120247ec: Status 404 returned error can't find the container with id d0f99f62523d78b3631845da66d029c811314428bb516ebeaf5dc422120247ec Apr 16 18:10:02.878573 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:02.878541 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ab651d_790e_48bc_91c8_e40eada59965.slice/crio-446b9cf42a4d39887e968b905f7e97b9edd7f145540e718b3e88da40077a8095 WatchSource:0}: Error finding container 446b9cf42a4d39887e968b905f7e97b9edd7f145540e718b3e88da40077a8095: Status 404 returned error can't find the container with id 446b9cf42a4d39887e968b905f7e97b9edd7f145540e718b3e88da40077a8095 Apr 16 18:10:02.902897 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.902865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"b5a7702326fbfda2e92a45ae6ebe09097c783190bf9a0409827da54f29313406"} Apr 16 18:10:02.903861 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.903829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tqcl9" event={"ID":"62ab651d-790e-48bc-91c8-e40eada59965","Type":"ContainerStarted","Data":"446b9cf42a4d39887e968b905f7e97b9edd7f145540e718b3e88da40077a8095"} Apr 16 18:10:02.904745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.904726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-snh2z" 
event={"ID":"32938f07-abc4-4f36-9ffc-6472e5b05222","Type":"ContainerStarted","Data":"d0f99f62523d78b3631845da66d029c811314428bb516ebeaf5dc422120247ec"} Apr 16 18:10:02.905691 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.905668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d24gh" event={"ID":"07c55bf2-6978-4a9f-ace9-a376fd86df33","Type":"ContainerStarted","Data":"19a383c18ec2b4ef776ce2d799b440855eb9c52b796d70b8e6b374f520c651e9"} Apr 16 18:10:02.906643 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.906620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" event={"ID":"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a","Type":"ContainerStarted","Data":"f5a438799ea694281e0961fa1b50e55c5c6dc180de404937a6a98d693424ac1f"} Apr 16 18:10:02.907718 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.907696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerStarted","Data":"885a2e326acd9dcfe61319d7658f48445019e787eec3934cdab8f2c65dc9d4ad"} Apr 16 18:10:02.908637 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.908613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" event={"ID":"ea8057f2-eb32-4731-8eae-88959bceb760","Type":"ContainerStarted","Data":"82e15fffe894ca113243a310321fb77ea97be98041ebc3587cfa56da97af10bd"} Apr 16 18:10:02.909685 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.909656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lsx75" event={"ID":"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba","Type":"ContainerStarted","Data":"5bfee6824eab4ef470867dfbc7854b9fcaa4b664ed32d562d6da1d1e27207d2d"} Apr 16 18:10:02.910567 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:02.910549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xcxgk" event={"ID":"969cd886-bdfc-46ca-ab57-08cca0abed0b","Type":"ContainerStarted","Data":"3e39381ab8583963bc10193717afceaca617a3715f13ed8466dd5d52de918283"} Apr 16 18:10:03.388369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.388091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:03.388552 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.388414 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:03.388552 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.388429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:03.388552 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.388492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:05.388472344 +0000 UTC m=+6.091803858 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:03.388552 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.388537 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:03.388773 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.388588 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:05.388571684 +0000 UTC m=+6.091903203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:03.491051 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.489084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:03.491051 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.489308 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:03.491051 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.489329 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:03.491051 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.489344 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:03.491051 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.489445 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:05.489423385 +0000 UTC m=+6.192754886 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:03.897419 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.897380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:03.897953 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.897514 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:03.897953 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.897904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:03.898076 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.897985 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:03.898076 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.898056 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:03.898175 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:03.898125 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:03.929684 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.929479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" event={"ID":"9a7b007f7f10e6a8374521e6218a633f","Type":"ContainerStarted","Data":"0188376c1c32c153267ded9802badac624fa2fb87542bf2d3f3c6a53b75b2154"} Apr 16 18:10:03.938911 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.938883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" event={"ID":"9de07c5d9aedf251a72fc754467481ac","Type":"ContainerStarted","Data":"abbf3fb3b1f1428747c49efcff9a9281b1d035d1a158098ac8454738abfa5992"} Apr 16 18:10:03.943596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:03.943549 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-51.ec2.internal" podStartSLOduration=3.943532997 podStartE2EDuration="3.943532997s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:03.942397362 +0000 UTC m=+4.645728889" watchObservedRunningTime="2026-04-16 18:10:03.943532997 +0000 UTC m=+4.646864514" Apr 16 18:10:04.949471 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:04.949163 2576 generic.go:358] "Generic (PLEG): container finished" podID="9de07c5d9aedf251a72fc754467481ac" containerID="abbf3fb3b1f1428747c49efcff9a9281b1d035d1a158098ac8454738abfa5992" exitCode=0 Apr 16 18:10:04.949471 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:04.949331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" event={"ID":"9de07c5d9aedf251a72fc754467481ac","Type":"ContainerDied","Data":"abbf3fb3b1f1428747c49efcff9a9281b1d035d1a158098ac8454738abfa5992"} Apr 16 18:10:05.406293 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.406257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:05.406482 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.406334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:05.406538 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.406493 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:05.406587 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.406555 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.406537041 +0000 UTC m=+10.109868566 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:05.406945 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.406924 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:05.407037 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.406976 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.406961691 +0000 UTC m=+10.110293195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:05.507151 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.507118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:05.507353 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.507308 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:05.507353 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.507333 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:05.507353 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.507345 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:05.507516 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.507407 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.507387747 +0000 UTC m=+10.210719255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:05.894698 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.894669 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:05.894884 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.894791 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:05.895121 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.895105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:05.895199 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.895173 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:05.897121 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:05.897098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:05.897231 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:05.897204 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:07.894607 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:07.894528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:07.894607 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:07.894538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:07.895126 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:07.894704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:07.900790 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:07.897975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:07.900790 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:07.898735 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:07.900790 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:07.898896 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.438546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.438624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.438755 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.438816 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.438797914 +0000 UTC m=+18.142129431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.439199 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:09.439300 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.439264 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.439232115 +0000 UTC m=+18.142563618 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:09.539959 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.539920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:09.540139 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.540087 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:09.540139 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.540109 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:09.540139 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.540123 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:09.540338 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.540183 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.540166729 +0000 UTC m=+18.243498247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.895936 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.895981 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:09.896069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.896136 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.896140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:09.896235 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:09.896201 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:10.963282 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:10.963223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" event={"ID":"9de07c5d9aedf251a72fc754467481ac","Type":"ContainerStarted","Data":"f028d53ad0ab90eec5853635e6ed8d8935125ba7c74207d2d2a70dbf254e3187"} Apr 16 18:10:10.978177 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:10.977673 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-51.ec2.internal" podStartSLOduration=10.977655293 podStartE2EDuration="10.977655293s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:10.977156919 +0000 UTC m=+11.680488444" watchObservedRunningTime="2026-04-16 18:10:10.977655293 +0000 UTC m=+11.680986818" Apr 16 18:10:11.894868 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:11.894829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:11.895913 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:11.895664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:11.896055 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:11.895676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:11.896055 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:11.895882 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:11.896204 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:11.896161 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:11.896310 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:11.896286 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:13.896849 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:13.896796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:13.897318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:13.896801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:13.897318 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:13.896916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:13.897318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:13.896801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:13.897318 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:13.896986 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:13.897318 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:13.897064 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:15.894428 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:15.894395 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:15.894875 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:15.894521 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:15.894875 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:15.894540 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:15.894875 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:15.894573 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:15.894875 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:15.894628 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:15.894875 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:15.894704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:17.505547 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.505505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:17.506016 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.505586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:17.506016 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.505671 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:17.506016 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.505710 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:17.506016 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.505740 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:33.505724419 +0000 UTC m=+34.209055921 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:17.506016 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.505757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:33.50575069 +0000 UTC m=+34.209082192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:17.606684 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.606641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:17.606858 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.606839 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:17.606909 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.606860 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:17.606909 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.606870 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.606968 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.606926 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:33.60690976 +0000 UTC m=+34.310241262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.895063 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.895028 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:17.895063 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.895051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:17.895318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:17.895029 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:17.895318 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.895166 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:17.895318 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.895262 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:17.895474 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:17.895393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:19.895514 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:19.895477 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:19.895915 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:19.895571 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:19.895915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:19.895653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:19.895915 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:19.895749 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:19.895915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:19.895791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:19.895915 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:19.895859 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:20.979503 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.978974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xcxgk" event={"ID":"969cd886-bdfc-46ca-ab57-08cca0abed0b","Type":"ContainerStarted","Data":"ae16aa0caad959b33627d18bfbb4d5b22ded5f61d83ea3b52e3b7f74b9913b38"} Apr 16 18:10:20.982110 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"8a4bc175c69efebed0726399929c0b7d5369a5a8fa003f31827cedc57d895f56"} Apr 16 18:10:20.982238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"e14dbb809e60320e5c8ca1978cda6170d7d8b4cf709dda591d834c460134db13"} Apr 16 18:10:20.982238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"f3c7d65aa219fc0332f3ec4bbe6d54385cb5a78f0f7048643248fa823fc5dfa0"} Apr 16 18:10:20.982238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"52fc5eec9b6c7ee64cf44a3de6ece2155dc6e51942430f59bd894fc4efcfe188"} Apr 16 18:10:20.982238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"ddfed43dcbee0c47893d3563234c57e1ca250fd3b5b51c2f4adf19c5b6d0056f"} Apr 16 18:10:20.982238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.982166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"be32a73ce06a2aa4539e64df9e97238ccdb7a9c5fd41e24cc9bc15594ebe3da6"} Apr 16 18:10:20.983452 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.983429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tqcl9" event={"ID":"62ab651d-790e-48bc-91c8-e40eada59965","Type":"ContainerStarted","Data":"ae32655ebe39944b10904778a8c5be6f19b6b28c1f901b2f707f94b2475573b7"} Apr 16 18:10:20.984820 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.984800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d24gh" event={"ID":"07c55bf2-6978-4a9f-ace9-a376fd86df33","Type":"ContainerStarted","Data":"e7d50aba1d29bdd297254494589b18dfb492ffcb3a445977b304d01d5bfc1f9f"} Apr 16 18:10:20.986128 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.986085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" event={"ID":"fd0b27ea-1963-4d05-88c4-3dced0ebdf3a","Type":"ContainerStarted","Data":"b8a7343fbca6a848bd6186f34d5787984cb84d55f9a699068c9ca191c7979bbe"} Apr 16 18:10:20.987462 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.987443 2576 
generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="e42f981997eeb83133388ad536deadf2354932ce028f8de651f24a5629671900" exitCode=0 Apr 16 18:10:20.987561 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.987506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"e42f981997eeb83133388ad536deadf2354932ce028f8de651f24a5629671900"} Apr 16 18:10:20.988790 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.988763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" event={"ID":"ea8057f2-eb32-4731-8eae-88959bceb760","Type":"ContainerStarted","Data":"bd123757758411d16d57a3dbd5f80a25fbcba41c4910bb97602da2c9a8b65603"} Apr 16 18:10:20.990051 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.990029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lsx75" event={"ID":"7a7a48be-c9f2-4cfa-8741-c0bfba5190ba","Type":"ContainerStarted","Data":"ea9dc9b9d2365ce795ac8621bbed9921345e9db1f97f6a85edb1610149ba3153"} Apr 16 18:10:20.992522 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:20.992472 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xcxgk" podStartSLOduration=9.507267941 podStartE2EDuration="21.992457198s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.868382583 +0000 UTC m=+3.571714091" lastFinishedPulling="2026-04-16 18:10:15.353571844 +0000 UTC m=+16.056903348" observedRunningTime="2026-04-16 18:10:20.99194161 +0000 UTC m=+21.695273135" watchObservedRunningTime="2026-04-16 18:10:20.992457198 +0000 UTC m=+21.695788723" Apr 16 18:10:21.017436 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.017380 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tqcl9" podStartSLOduration=4.860658953 podStartE2EDuration="22.017364473s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.88030678 +0000 UTC m=+3.583638285" lastFinishedPulling="2026-04-16 18:10:20.037012303 +0000 UTC m=+20.740343805" observedRunningTime="2026-04-16 18:10:21.016630741 +0000 UTC m=+21.719962267" watchObservedRunningTime="2026-04-16 18:10:21.017364473 +0000 UTC m=+21.720695998" Apr 16 18:10:21.049646 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.049603 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d24gh" podStartSLOduration=3.843864224 podStartE2EDuration="21.049589332s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.877128132 +0000 UTC m=+3.580459643" lastFinishedPulling="2026-04-16 18:10:20.082853246 +0000 UTC m=+20.786184751" observedRunningTime="2026-04-16 18:10:21.049338752 +0000 UTC m=+21.752670276" watchObservedRunningTime="2026-04-16 18:10:21.049589332 +0000 UTC m=+21.752920856" Apr 16 18:10:21.061496 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.061455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lsx75" podStartSLOduration=4.894469851 podStartE2EDuration="22.061443082s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.869759219 +0000 UTC m=+3.573090720" lastFinishedPulling="2026-04-16 18:10:20.036732439 
+0000 UTC m=+20.740063951" observedRunningTime="2026-04-16 18:10:21.061168326 +0000 UTC m=+21.764499840" watchObservedRunningTime="2026-04-16 18:10:21.061443082 +0000 UTC m=+21.764774605" Apr 16 18:10:21.218988 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.218963 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:10:21.830643 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.830484 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:21.218986343Z","UUID":"5cdf02b0-c906-4472-b170-699b8e88cd8b","Handler":null,"Name":"","Endpoint":""} Apr 16 18:10:21.832491 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.832450 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:10:21.832491 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.832482 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:10:21.894737 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.894709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:21.894927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.894710 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:21.894927 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:21.894820 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:21.894927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.894833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:21.894927 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:21.894903 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:21.895120 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:21.894997 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:21.993153 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.993111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-snh2z" event={"ID":"32938f07-abc4-4f36-9ffc-6472e5b05222","Type":"ContainerStarted","Data":"deeace4a680437e8a23c950cf4daccecb02e0939c9200d4c96aa3eea03bbcc4b"} Apr 16 18:10:21.995070 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:21.995034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" event={"ID":"ea8057f2-eb32-4731-8eae-88959bceb760","Type":"ContainerStarted","Data":"da427979831471854d62c3c08ecb2c1c45bd419af97e200468bbfecea6ed2548"} Apr 16 18:10:22.005901 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:22.005802 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-snh2z" podStartSLOduration=5.84819319 podStartE2EDuration="23.005787012s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.879099956 +0000 UTC m=+3.582431461" lastFinishedPulling="2026-04-16 18:10:20.036693775 +0000 UTC m=+20.740025283" observedRunningTime="2026-04-16 18:10:22.005349818 +0000 UTC m=+22.708681343" watchObservedRunningTime="2026-04-16 18:10:22.005787012 +0000 UTC m=+22.709118550" Apr 16 18:10:22.006021 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:22.005986 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xrwkr" podStartSLOduration=5.84416008 podStartE2EDuration="23.005979465s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.875300075 +0000 UTC m=+3.578631577" lastFinishedPulling="2026-04-16 18:10:20.037119449 +0000 UTC m=+20.740450962" observedRunningTime="2026-04-16 18:10:21.080105756 +0000 UTC m=+21.783437280" watchObservedRunningTime="2026-04-16 18:10:22.005979465 +0000 UTC m=+22.709310988" Apr 16 18:10:22.999378 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:22.999268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" event={"ID":"ea8057f2-eb32-4731-8eae-88959bceb760","Type":"ContainerStarted","Data":"594ba24798316cbe3af2253fc6cc9ab1fbffb2f715ca5616836e5030e9e28557"} Apr 16 18:10:23.002422 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:23.002396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"de7ff9f66cbacc41fb87be88cea75c1a39f971e2014dd8ee8c314d210f90b8de"} Apr 16 18:10:23.016992 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:23.016939 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5mjbz" podStartSLOduration=3.891114843 podStartE2EDuration="23.016924001s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.873589827 +0000 UTC m=+3.576921329" lastFinishedPulling="2026-04-16 18:10:21.999398969 +0000 UTC m=+22.702730487" observedRunningTime="2026-04-16 18:10:23.016409495 +0000 UTC m=+23.719741020" watchObservedRunningTime="2026-04-16 18:10:23.016924001 +0000 UTC m=+23.720255524" Apr 16 18:10:23.895202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:23.895169 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:23.895396 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:23.895302 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:23.895396 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:23.895175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:23.895396 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:23.895169 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:23.895556 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:23.895402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:23.895556 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:23.895490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:25.009624 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.009448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" event={"ID":"ced49c9c-a339-4f7e-9970-0aabc7b2d765","Type":"ContainerStarted","Data":"3908ef4c4a29b5912002bd98bdeb9c995d1874de8f073acc69f50eb7c1bff2f0"} Apr 16 18:10:25.010173 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.009816 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:25.026588 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.026559 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:25.037728 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.037674 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" podStartSLOduration=7.767358711 podStartE2EDuration="25.037658356s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.866533918 +0000 UTC m=+3.569865423" lastFinishedPulling="2026-04-16 18:10:20.136833562 +0000 UTC m=+20.840165068" observedRunningTime="2026-04-16 18:10:25.036727251 +0000 UTC m=+25.740058774" watchObservedRunningTime="2026-04-16 18:10:25.037658356 +0000 UTC m=+25.740989880" Apr 16 18:10:25.179622 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.179595 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:25.180219 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.180199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:25.894338 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.894308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:25.894499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.894308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:25.894499 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:25.894424 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:25.894499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:25.894308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:25.894637 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:25.894466 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:25.894637 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:25.894553 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:26.012027 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.011994 2576 generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="34f4615d3c25ae97a83975e9c5b552355dda4b50a4c2bf5e392689616a8132d3" exitCode=0 Apr 16 18:10:26.012681 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.012088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"34f4615d3c25ae97a83975e9c5b552355dda4b50a4c2bf5e392689616a8132d3"} Apr 16 18:10:26.012681 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.012214 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:26.013163 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.012872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lsx75" Apr 16 18:10:26.013163 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.012894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:26.013163 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.012907 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:26.027159 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:26.027141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" Apr 16 18:10:27.088578 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.088339 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-slcjb"] Apr 16 18:10:27.089375 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.088674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:27.089375 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:27.088782 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:27.091221 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.091131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-687m2"] Apr 16 18:10:27.096064 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.092486 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:27.096064 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:27.092652 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:27.096064 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.093869 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qhbm6"] Apr 16 18:10:27.096064 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.093974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:27.096064 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:27.094074 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:27.741411 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:27.741386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tqcl9_62ab651d-790e-48bc-91c8-e40eada59965/dns-node-resolver/0.log" Apr 16 18:10:28.016762 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.016680 2576 generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="b22e283aa0a0f4cca4a2645b6e96b15a9da6d9d6ac09e94770d90a97de277261" exitCode=0 Apr 16 18:10:28.016883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.016764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"b22e283aa0a0f4cca4a2645b6e96b15a9da6d9d6ac09e94770d90a97de277261"} Apr 16 18:10:28.726330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.726306 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xcxgk_969cd886-bdfc-46ca-ab57-08cca0abed0b/node-ca/0.log" Apr 16 18:10:28.894110 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.894080 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:28.894110 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.894110 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:28.894340 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:28.894121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:28.894340 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:28.894218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:28.894439 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:28.894338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:28.894486 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:28.894431 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:30.022068 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:30.022029 2576 generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="a89b13858e20c4f4ffc68cf0f6535e884b692c3e8d428c72dcad00391f887824" exitCode=0 Apr 16 18:10:30.022486 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:30.022092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"a89b13858e20c4f4ffc68cf0f6535e884b692c3e8d428c72dcad00391f887824"} Apr 16 18:10:30.895122 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:30.895091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:30.895122 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:30.895108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:30.895338 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:30.895092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:30.895338 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:30.895256 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:30.895456 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:30.895338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:30.895456 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:30.895393 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:32.894950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:32.894880 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:32.894950 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:32.894905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:32.895635 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:32.894882 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:32.895635 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:32.895023 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:32.895635 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:32.895079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:32.895635 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:32.895163 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:33.522548 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:33.522509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:33.522740 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:33.522568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:33.522740 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.522657 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:33.522740 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.522685 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:33.522740 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.522708 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs podName:abb881c2-5bd6-4f73-a490-2665c9449ae7 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:05.522695043 +0000 UTC m=+66.226026545 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs") pod "network-metrics-daemon-687m2" (UID: "abb881c2-5bd6-4f73-a490-2665c9449ae7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:33.522898 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.522747 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret podName:b097be47-4dfd-4fcd-a8ab-78a2cd491538 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:05.522729668 +0000 UTC m=+66.226061173 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret") pod "global-pull-secret-syncer-slcjb" (UID: "b097be47-4dfd-4fcd-a8ab-78a2cd491538") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:10:33.623156 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:33.623118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:33.623362 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.623275 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:33.623362 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.623290 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:33.623362 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.623299 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ksqz7 for pod openshift-network-diagnostics/network-check-target-qhbm6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:33.623362 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:33.623353 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7 podName:795df744-5070-4941-a2b6-f01fc85241b9 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:05.623338305 +0000 UTC m=+66.326669831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ksqz7" (UniqueName: "kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7") pod "network-check-target-qhbm6" (UID: "795df744-5070-4941-a2b6-f01fc85241b9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:34.895136 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:34.895102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:34.895609 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:34.895102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:34.895609 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:34.895218 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:34.895609 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:34.895113 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:34.895609 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:34.895295 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:34.895609 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:34.895394 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:36.894555 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:36.894369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:36.894912 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:36.894369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:36.894912 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:36.894644 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:36.894912 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:36.894734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:36.894912 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:36.894369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:36.894912 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:36.894827 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:37.039318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:37.039286 2576 generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="9983233b5d8aeeb0001afee6ca2c3a3fb6e6019162fe6172c6c4955b6b3fc656" exitCode=0 Apr 16 18:10:37.039467 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:37.039336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"9983233b5d8aeeb0001afee6ca2c3a3fb6e6019162fe6172c6c4955b6b3fc656"} Apr 16 18:10:38.043656 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:38.043624 2576 generic.go:358] "Generic (PLEG): container finished" podID="8670fc01-fa98-49d3-8a70-5fe409cb46a1" containerID="0d468f7821e3d49850164a7333880ca130d35ee7c3edd9612dc77eefc0c8cdce" exitCode=0 Apr 16 18:10:38.044208 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:38.043681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerDied","Data":"0d468f7821e3d49850164a7333880ca130d35ee7c3edd9612dc77eefc0c8cdce"} Apr 16 18:10:38.895099 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:38.895060 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:38.895329 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:38.895067 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:38.895329 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:38.895159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:38.895329 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:38.895073 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:38.895329 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:38.895274 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:38.895329 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:38.895323 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:39.048800 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:39.048762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" event={"ID":"8670fc01-fa98-49d3-8a70-5fe409cb46a1","Type":"ContainerStarted","Data":"c49d858eed44f6a75980dad80f2577a73751385e7da99e6d4d91c94b576f23bd"} Apr 16 18:10:39.071186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:39.071136 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fkzwr" podStartSLOduration=6.883811974 podStartE2EDuration="40.071122559s" podCreationTimestamp="2026-04-16 18:09:59 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.875082906 +0000 UTC m=+3.578414419" lastFinishedPulling="2026-04-16 18:10:36.062393487 +0000 UTC m=+36.765725004" observedRunningTime="2026-04-16 18:10:39.071078744 +0000 UTC m=+39.774410268" watchObservedRunningTime="2026-04-16 18:10:39.071122559 +0000 UTC m=+39.774454061" Apr 16 18:10:40.894797 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:40.894764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:40.895265 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:40.894764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:40.895265 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:40.894874 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:40.895265 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:40.894965 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:40.895265 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:40.894764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:40.895265 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:40.895083 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:42.894996 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:42.894947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:42.894996 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:42.894994 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:42.895450 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:42.895038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:42.895450 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:42.895134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:42.895450 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:42.895232 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:42.895450 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:42.895320 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:44.895111 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:44.895070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:44.895111 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:44.895093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:44.895567 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:44.895196 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:44.895567 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:44.895278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:44.895567 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:44.895275 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:44.895567 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:44.895365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:46.894349 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:46.894321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:46.894827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:46.894324 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:46.894827 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:46.894433 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:46.894827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:46.894330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:46.894827 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:46.894514 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:46.894827 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:46.894583 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:48.894206 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:48.894173 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:48.894645 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:48.894179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:48.894645 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:48.894285 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:48.894645 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:48.894179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:48.894645 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:48.894372 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:48.894645 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:48.894464 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:50.895008 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:50.894977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:50.895397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:50.895081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:50.895397 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:50.895091 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:50.895397 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:50.895167 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:50.895397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:50.895204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:50.895397 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:50.895284 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:52.894570 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:52.894538 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:52.894570 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:52.894557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:52.894968 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:52.894541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:52.894968 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:52.894642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qhbm6" podUID="795df744-5070-4941-a2b6-f01fc85241b9" Apr 16 18:10:52.894968 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:52.894712 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-687m2" podUID="abb881c2-5bd6-4f73-a490-2665c9449ae7" Apr 16 18:10:52.894968 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:10:52.894778 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-slcjb" podUID="b097be47-4dfd-4fcd-a8ab-78a2cd491538" Apr 16 18:10:53.053841 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.053762 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-51.ec2.internal" event="NodeReady" Apr 16 18:10:53.054010 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.053915 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:10:53.087305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.087272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f98dfdd99-ll2gn"] Apr 16 18:10:53.107978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.107947 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f98dfdd99-ll2gn"] Apr 16 18:10:53.107978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.107976 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-glzcp"] Apr 16 18:10:53.108151 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.108098 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.110219 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.110182 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:10:53.110397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.110302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v2fsg\"" Apr 16 18:10:53.110397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.110335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:10:53.110397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.110373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:10:53.115859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.115505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:10:53.126464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.126382 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jxhl9"] Apr 16 18:10:53.126619 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.126491 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.131992 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.131849 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:10:53.131992 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.131875 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7tj5h\"" Apr 16 18:10:53.131992 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.131889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:10:53.131992 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.131929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:10:53.150307 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.150283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-glzcp"] Apr 16 18:10:53.150416 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.150315 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v2wwf"] Apr 16 18:10:53.150416 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.150407 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.152535 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.152514 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:10:53.152669 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.152606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkg79\"" Apr 16 18:10:53.152728 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.152715 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:10:53.167605 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-bound-sa-token\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/206d1627-bacb-468e-ab0d-e0782a1f13d1-metrics-tls\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.167705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79n9\" (UniqueName: \"kubernetes.io/projected/206d1627-bacb-468e-ab0d-e0782a1f13d1-kube-api-access-h79n9\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.167783 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcjx5\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-kube-api-access-pcjx5\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167783 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-image-registry-private-configuration\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/206d1627-bacb-468e-ab0d-e0782a1f13d1-config-volume\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.167860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gwvs\" (UniqueName: 
\"kubernetes.io/projected/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-kube-api-access-8gwvs\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.167860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-certificates\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-trusted-ca\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167979 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-tls\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.167979 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-cert\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.167979 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75b1346f-cd61-4f00-9522-07bd0a521fc6-ca-trust-extracted\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.168062 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.167979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-installation-pull-secrets\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.168062 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.168018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/206d1627-bacb-468e-ab0d-e0782a1f13d1-tmp-dir\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.173811 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.173796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxhl9"] Apr 16 18:10:53.173847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.173821 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v2wwf"] Apr 16 18:10:53.173935 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.173925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.176081 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.176026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:10:53.176180 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.176085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:10:53.176180 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.176139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bxtnw\"" Apr 16 18:10:53.176180 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.176163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:10:53.176357 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.176142 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:10:53.268694 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/206d1627-bacb-468e-ab0d-e0782a1f13d1-config-volume\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.268694 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gwvs\" (UniqueName: \"kubernetes.io/projected/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-kube-api-access-8gwvs\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.268915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-certificates\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.268915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.268915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-trusted-ca\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.268915 
ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-tls\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-cert\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75b1346f-cd61-4f00-9522-07bd0a521fc6-ca-trust-extracted\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.268983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-installation-pull-secrets\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/206d1627-bacb-468e-ab0d-e0782a1f13d1-tmp-dir\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s7xq\" (UniqueName: \"kubernetes.io/projected/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-api-access-8s7xq\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-bound-sa-token\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-crio-socket\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/206d1627-bacb-468e-ab0d-e0782a1f13d1-metrics-tls\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h79n9\" (UniqueName: \"kubernetes.io/projected/206d1627-bacb-468e-ab0d-e0782a1f13d1-kube-api-access-h79n9\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcjx5\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-kube-api-access-pcjx5\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/206d1627-bacb-468e-ab0d-e0782a1f13d1-config-volume\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-image-registry-private-configuration\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269427 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75b1346f-cd61-4f00-9522-07bd0a521fc6-ca-trust-extracted\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.269786 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-data-volume\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.269786 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/206d1627-bacb-468e-ab0d-e0782a1f13d1-tmp-dir\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " 
pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.269786 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.269670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-certificates\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.270481 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.270461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75b1346f-cd61-4f00-9522-07bd0a521fc6-trusted-ca\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.273213 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.273190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/206d1627-bacb-468e-ab0d-e0782a1f13d1-metrics-tls\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.273338 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.273230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-installation-pull-secrets\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.273386 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.273333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-registry-tls\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.273386 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.273344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75b1346f-cd61-4f00-9522-07bd0a521fc6-image-registry-private-configuration\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.273386 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.273378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-cert\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.292924 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.292892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-bound-sa-token\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.293202 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.293184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79n9\" (UniqueName: 
\"kubernetes.io/projected/206d1627-bacb-468e-ab0d-e0782a1f13d1-kube-api-access-h79n9\") pod \"dns-default-jxhl9\" (UID: \"206d1627-bacb-468e-ab0d-e0782a1f13d1\") " pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.293379 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.293212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcjx5\" (UniqueName: \"kubernetes.io/projected/75b1346f-cd61-4f00-9522-07bd0a521fc6-kube-api-access-pcjx5\") pod \"image-registry-f98dfdd99-ll2gn\" (UID: \"75b1346f-cd61-4f00-9522-07bd0a521fc6\") " pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.293449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.293401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gwvs\" (UniqueName: \"kubernetes.io/projected/33d3f9c8-7acd-4da5-93e7-1274c864ad1c-kube-api-access-8gwvs\") pod \"ingress-canary-glzcp\" (UID: \"33d3f9c8-7acd-4da5-93e7-1274c864ad1c\") " pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.370306 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-data-volume\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370306 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370306 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s7xq\" (UniqueName: \"kubernetes.io/projected/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-api-access-8s7xq\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370306 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-crio-socket\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370550 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370550 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-crio-socket\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") 
" pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370625 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-data-volume\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.370919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.370887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.372595 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.372578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.378039 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.378018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s7xq\" (UniqueName: \"kubernetes.io/projected/c98a1bb4-edd0-485a-b6fc-87204ad0e0dc-kube-api-access-8s7xq\") pod \"insights-runtime-extractor-v2wwf\" (UID: \"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc\") " pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.419033 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.419005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:53.439217 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.439193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-glzcp" Apr 16 18:10:53.458976 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.458943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:53.482224 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.482195 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v2wwf" Apr 16 18:10:53.637138 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.637087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jxhl9"] Apr 16 18:10:53.640754 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.640707 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f98dfdd99-ll2gn"] Apr 16 18:10:53.641739 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:53.641714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206d1627_bacb_468e_ab0d_e0782a1f13d1.slice/crio-19800ebcdd177fe7322c024d69aa1aba16daafd934ed5a88ab97a8d9bce32845 WatchSource:0}: Error finding container 19800ebcdd177fe7322c024d69aa1aba16daafd934ed5a88ab97a8d9bce32845: Status 404 returned error can't find the container with id 19800ebcdd177fe7322c024d69aa1aba16daafd934ed5a88ab97a8d9bce32845 Apr 16 18:10:53.644658 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:53.644631 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75b1346f_cd61_4f00_9522_07bd0a521fc6.slice/crio-5c88584e35d2de850f31f81ad0351eea896de7bfc1f1b5ab6f7d0e4fa8b78f9d WatchSource:0}: Error finding container 5c88584e35d2de850f31f81ad0351eea896de7bfc1f1b5ab6f7d0e4fa8b78f9d: Status 404 returned error can't find the container with id 5c88584e35d2de850f31f81ad0351eea896de7bfc1f1b5ab6f7d0e4fa8b78f9d Apr 16 18:10:53.647696 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.647670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-glzcp"] Apr 16 18:10:53.651752 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:53.651729 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d3f9c8_7acd_4da5_93e7_1274c864ad1c.slice/crio-4b117d76c564213b9389378c360f62e76f39a9aca6366fce865df8f2ff5f20e3 WatchSource:0}: Error finding container 4b117d76c564213b9389378c360f62e76f39a9aca6366fce865df8f2ff5f20e3: Status 404 returned error can't find the container with id 4b117d76c564213b9389378c360f62e76f39a9aca6366fce865df8f2ff5f20e3 Apr 16 18:10:53.665890 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:53.665867 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v2wwf"] Apr 16 18:10:53.668458 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:10:53.668436 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98a1bb4_edd0_485a_b6fc_87204ad0e0dc.slice/crio-cd7ea9c90a01d8aff951f5a2f6fa75b9d2f4b4b8725258a9fb12d86691576786 WatchSource:0}: Error finding container cd7ea9c90a01d8aff951f5a2f6fa75b9d2f4b4b8725258a9fb12d86691576786: Status 404 returned error can't find the container with id cd7ea9c90a01d8aff951f5a2f6fa75b9d2f4b4b8725258a9fb12d86691576786 Apr 16 18:10:54.076352 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.076314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-glzcp" event={"ID":"33d3f9c8-7acd-4da5-93e7-1274c864ad1c","Type":"ContainerStarted","Data":"4b117d76c564213b9389378c360f62e76f39a9aca6366fce865df8f2ff5f20e3"} Apr 16 18:10:54.077878 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.077844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-v2wwf" event={"ID":"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc","Type":"ContainerStarted","Data":"946ffc5cb2723b324625c4bf5bb49571f8bb1b0999f3d7b353b2cf6bf2ba2b24"} Apr 16 18:10:54.078009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.077884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2wwf" event={"ID":"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc","Type":"ContainerStarted","Data":"cd7ea9c90a01d8aff951f5a2f6fa75b9d2f4b4b8725258a9fb12d86691576786"} Apr 16 18:10:54.079153 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.079120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxhl9" event={"ID":"206d1627-bacb-468e-ab0d-e0782a1f13d1","Type":"ContainerStarted","Data":"19800ebcdd177fe7322c024d69aa1aba16daafd934ed5a88ab97a8d9bce32845"} Apr 16 18:10:54.080559 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.080533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" event={"ID":"75b1346f-cd61-4f00-9522-07bd0a521fc6","Type":"ContainerStarted","Data":"a6382eecf11091ec43c05caa2eacb67543b0610c0a88220344528869b3b8e12c"} Apr 16 18:10:54.080664 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.080564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" event={"ID":"75b1346f-cd61-4f00-9522-07bd0a521fc6","Type":"ContainerStarted","Data":"5c88584e35d2de850f31f81ad0351eea896de7bfc1f1b5ab6f7d0e4fa8b78f9d"} Apr 16 18:10:54.080728 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.080709 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:10:54.102440 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.100576 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" podStartSLOduration=2.100559686 podStartE2EDuration="2.100559686s" podCreationTimestamp="2026-04-16 18:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:54.099811047 +0000 UTC m=+54.803142596" watchObservedRunningTime="2026-04-16 18:10:54.100559686 +0000 UTC m=+54.803891213" Apr 16 18:10:54.894394 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.894359 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:10:54.894607 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.894359 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:10:54.894607 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.894364 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:10:54.896855 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.896833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:10:54.897464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.897442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\"" Apr 16 18:10:54.897571 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.897474 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:10:54.897571 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.897532 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:10:54.897721 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.897701 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w4nfv\"" Apr 16 18:10:54.910267 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:54.910226 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:10:56.086384 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:56.086357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-glzcp" event={"ID":"33d3f9c8-7acd-4da5-93e7-1274c864ad1c","Type":"ContainerStarted","Data":"7fd0011e1b269d66b3848c4ebd033860d021d94e2e70d410dd3a97bfa1e71586"} Apr 16 18:10:56.089006 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:56.088834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2wwf" event={"ID":"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc","Type":"ContainerStarted","Data":"4c21ad14096d0ccbb56d74cdabdf0701a5ab2bbddd7f5f26c905b6abaab6b38f"} Apr 16 18:10:56.090284 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:56.090233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxhl9" event={"ID":"206d1627-bacb-468e-ab0d-e0782a1f13d1","Type":"ContainerStarted","Data":"b68793038129724367243907682441dc84e8f58f79853d5d9659945ae74f00d5"} Apr 16 18:10:56.101559 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:56.101517 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-glzcp" podStartSLOduration=0.834938665 podStartE2EDuration="3.101501893s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.653388785 +0000 UTC m=+54.356720291" lastFinishedPulling="2026-04-16 18:10:55.919952017 +0000 UTC m=+56.623283519" observedRunningTime="2026-04-16 18:10:56.100597908 +0000 UTC m=+56.803929433" watchObservedRunningTime="2026-04-16 18:10:56.101501893 +0000 UTC m=+56.804833417" Apr 16 18:10:57.094632 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:57.094583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jxhl9" event={"ID":"206d1627-bacb-468e-ab0d-e0782a1f13d1","Type":"ContainerStarted","Data":"98081d88882a9b1e3a8173968a367ce306d2192701c3a4875768611f153487bc"} Apr 16 18:10:58.031531 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:58.031508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkbc8" 
Apr 16 18:10:58.056159 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:58.056122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jxhl9" podStartSLOduration=2.795453334 podStartE2EDuration="5.056108882s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.643931407 +0000 UTC m=+54.347262909" lastFinishedPulling="2026-04-16 18:10:55.904586952 +0000 UTC m=+56.607918457" observedRunningTime="2026-04-16 18:10:57.11172473 +0000 UTC m=+57.815056254" watchObservedRunningTime="2026-04-16 18:10:58.056108882 +0000 UTC m=+58.759440405" Apr 16 18:10:58.099220 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:58.099185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2wwf" event={"ID":"c98a1bb4-edd0-485a-b6fc-87204ad0e0dc","Type":"ContainerStarted","Data":"2a062c389354c1d57dad5fcf02f1d6d85e8ca5bc9dd6b01847ea628e345ed9a9"} Apr 16 18:10:58.099567 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:58.099340 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jxhl9" Apr 16 18:10:58.116229 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:10:58.116139 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v2wwf" podStartSLOduration=1.068117339 podStartE2EDuration="5.116124502s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.807500906 +0000 UTC m=+54.510832407" lastFinishedPulling="2026-04-16 18:10:57.855508065 +0000 UTC m=+58.558839570" observedRunningTime="2026-04-16 18:10:58.115520909 +0000 UTC m=+58.818852434" watchObservedRunningTime="2026-04-16 18:10:58.116124502 +0000 UTC m=+58.819456026" Apr 16 18:11:01.145467 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.145433 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zg2cq"] Apr 16 18:11:01.150446 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.150424 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.152706 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.152682 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:11:01.152706 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.152698 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:11:01.152989 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.152753 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:11:01.152989 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.152783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:11:01.152989 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.152972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:11:01.153189 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.153176 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:11:01.153653 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.153637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rnk47\"" Apr 16 18:11:01.159611 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.159591 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-dwchl"] Apr 16 18:11:01.162538 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.162524 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.164667 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.164651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-f6mtf\"" Apr 16 18:11:01.164747 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.164713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:11:01.164747 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.164724 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:11:01.164847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.164823 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:11:01.172218 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.172196 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-dwchl"] Apr 16 18:11:01.227213 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.227370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-root\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.227370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsvq\" (UniqueName: \"kubernetes.io/projected/343458e2-2056-4ac4-9044-aacda978bc94-kube-api-access-dzsvq\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.227516 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.227516 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:11:01.227453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.227516 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-sys\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227516 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-wtmp\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227631 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-tls\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227631 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-metrics-client-ca\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227631 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-textfile\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4vr\" (UniqueName: \"kubernetes.io/projected/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-kube-api-access-6f4vr\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227694 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.227745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.227711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/343458e2-2056-4ac4-9044-aacda978bc94-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.328592 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.328592 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/343458e2-2056-4ac4-9044-aacda978bc94-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.328778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.328778 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:11:01.328724 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:11:01.328778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-root\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.328921 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:11:01.328787 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls podName:343458e2-2056-4ac4-9044-aacda978bc94 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:01.82877102 +0000 UTC m=+62.532102522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-dwchl" (UID: "343458e2-2056-4ac4-9044-aacda978bc94") : secret "kube-state-metrics-tls" not found Apr 16 18:11:01.328921 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-root\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.328921 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.328921 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsvq\" (UniqueName: \"kubernetes.io/projected/343458e2-2056-4ac4-9044-aacda978bc94-kube-api-access-dzsvq\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.329119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.328949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.329119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.329119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/343458e2-2056-4ac4-9044-aacda978bc94-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.329119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-sys\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" 
(UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-wtmp\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-tls\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-metrics-client-ca\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-sys\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-textfile\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4vr\" (UniqueName: \"kubernetes.io/projected/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-kube-api-access-6f4vr\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-wtmp\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329797 
ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-textfile\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.329797 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.329554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.330054 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.330031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343458e2-2056-4ac4-9044-aacda978bc94-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.330089 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.330056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-metrics-client-ca\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.331650 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.331616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.331932 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.331910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.332012 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.331919 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-node-exporter-tls\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.339839 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.339817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4vr\" (UniqueName: \"kubernetes.io/projected/5cb51374-d2cd-4fec-8477-88d4c3d6d74a-kube-api-access-6f4vr\") pod \"node-exporter-zg2cq\" (UID: \"5cb51374-d2cd-4fec-8477-88d4c3d6d74a\") " pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.340222 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.340199 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dzsvq\" (UniqueName: \"kubernetes.io/projected/343458e2-2056-4ac4-9044-aacda978bc94-kube-api-access-dzsvq\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.460000 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.459922 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zg2cq" Apr 16 18:11:01.467385 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:01.467353 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb51374_d2cd_4fec_8477_88d4c3d6d74a.slice/crio-fdd402172aa3f4f0e56948b789e09cbb5a9f7f4be54ed6f758cde4bc8b8b0faf WatchSource:0}: Error finding container fdd402172aa3f4f0e56948b789e09cbb5a9f7f4be54ed6f758cde4bc8b8b0faf: Status 404 returned error can't find the container with id fdd402172aa3f4f0e56948b789e09cbb5a9f7f4be54ed6f758cde4bc8b8b0faf Apr 16 18:11:01.834173 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:01.834057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:01.834360 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:11:01.834218 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:11:01.834360 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:11:01.834313 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls podName:343458e2-2056-4ac4-9044-aacda978bc94 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:02.834295989 +0000 UTC m=+63.537627490 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-dwchl" (UID: "343458e2-2056-4ac4-9044-aacda978bc94") : secret "kube-state-metrics-tls" not found Apr 16 18:11:02.111054 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:02.110971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zg2cq" event={"ID":"5cb51374-d2cd-4fec-8477-88d4c3d6d74a","Type":"ContainerStarted","Data":"fdd402172aa3f4f0e56948b789e09cbb5a9f7f4be54ed6f758cde4bc8b8b0faf"} Apr 16 18:11:02.841020 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:02.840920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:02.843465 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:02.843440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/343458e2-2056-4ac4-9044-aacda978bc94-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-dwchl\" (UID: \"343458e2-2056-4ac4-9044-aacda978bc94\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:02.974702 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:02.974655 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" Apr 16 18:11:03.088073 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:03.088041 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-dwchl"] Apr 16 18:11:03.091071 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:03.091023 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343458e2_2056_4ac4_9044_aacda978bc94.slice/crio-fd4f8d4859cadb75747bd0118a6a911ed7756b0ec7209dad14d16e13291c9820 WatchSource:0}: Error finding container fd4f8d4859cadb75747bd0118a6a911ed7756b0ec7209dad14d16e13291c9820: Status 404 returned error can't find the container with id fd4f8d4859cadb75747bd0118a6a911ed7756b0ec7209dad14d16e13291c9820 Apr 16 18:11:03.114591 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:03.114559 2576 generic.go:358] "Generic (PLEG): container finished" podID="5cb51374-d2cd-4fec-8477-88d4c3d6d74a" containerID="454a3453b71ec6647b49aa3f583ac45564c792b4784639e41297ebe154e07a39" exitCode=0 Apr 16 18:11:03.114711 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:03.114632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zg2cq" event={"ID":"5cb51374-d2cd-4fec-8477-88d4c3d6d74a","Type":"ContainerDied","Data":"454a3453b71ec6647b49aa3f583ac45564c792b4784639e41297ebe154e07a39"} Apr 16 18:11:03.115736 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:03.115712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" event={"ID":"343458e2-2056-4ac4-9044-aacda978bc94","Type":"ContainerStarted","Data":"fd4f8d4859cadb75747bd0118a6a911ed7756b0ec7209dad14d16e13291c9820"} Apr 16 18:11:04.120789 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.120748 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zg2cq" event={"ID":"5cb51374-d2cd-4fec-8477-88d4c3d6d74a","Type":"ContainerStarted","Data":"a65e0027991919a96b57fe5a150f07aad3e2aea786fcfc047e11445d21f23780"} Apr 16 18:11:04.121223 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.120797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zg2cq" event={"ID":"5cb51374-d2cd-4fec-8477-88d4c3d6d74a","Type":"ContainerStarted","Data":"e20770adbeb7850d98ba07daca07c4388f025eeea587f60ceb0714076455dfbd"} Apr 16 18:11:04.141942 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.141898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zg2cq" podStartSLOduration=2.349939154 podStartE2EDuration="3.141880191s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:01.469199911 +0000 UTC m=+62.172531413" lastFinishedPulling="2026-04-16 18:11:02.261140936 +0000 UTC m=+62.964472450" observedRunningTime="2026-04-16 18:11:04.140315856 +0000 UTC m=+64.843647381" watchObservedRunningTime="2026-04-16 18:11:04.141880191 +0000 UTC m=+64.845211716" Apr 16 18:11:04.210919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.210883 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f49994ccb-rb9jz"] Apr 16 18:11:04.215009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.214984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.217163 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:11:04.217299 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4albjudld7gul\"" Apr 16 18:11:04.217541 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217520 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xqhmr\"" Apr 16 18:11:04.217626 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:11:04.217815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217792 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:11:04.217902 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:11:04.217951 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.217934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:11:04.224850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.224806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f49994ccb-rb9jz"] Apr 16 18:11:04.252123 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9tl\" (UniqueName: \"kubernetes.io/projected/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-kube-api-access-tt9tl\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-tls\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-metrics-client-ca\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.252505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.252476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-grpc-tls\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " 
pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9tl\" (UniqueName: \"kubernetes.io/projected/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-kube-api-access-tt9tl\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-tls\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.353778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-metrics-client-ca\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.354001 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.353803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-grpc-tls\") pod 
\"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.354566 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.354480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-metrics-client-ca\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356714 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356714 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-grpc-tls\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356714 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356864 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-tls\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.356913 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.356892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.362162 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.362142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9tl\" (UniqueName: 
\"kubernetes.io/projected/252e09bb-e8cc-4008-9e5a-8c49ae1beec9-kube-api-access-tt9tl\") pod \"thanos-querier-7f49994ccb-rb9jz\" (UID: \"252e09bb-e8cc-4008-9e5a-8c49ae1beec9\") " pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.526043 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.526012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:04.664442 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:04.664353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f49994ccb-rb9jz"] Apr 16 18:11:04.667757 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:04.667717 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252e09bb_e8cc_4008_9e5a_8c49ae1beec9.slice/crio-f2e027017772996b92bd83671db3e453d4256b79425b745f2a3b53cba4ead460 WatchSource:0}: Error finding container f2e027017772996b92bd83671db3e453d4256b79425b745f2a3b53cba4ead460: Status 404 returned error can't find the container with id f2e027017772996b92bd83671db3e453d4256b79425b745f2a3b53cba4ead460 Apr 16 18:11:05.125555 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.125529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" event={"ID":"343458e2-2056-4ac4-9044-aacda978bc94","Type":"ContainerStarted","Data":"bdf0ab9437e97bf0fced17aac5ab32c0f877dd4a3bf6126e5bdd528a121f3663"} Apr 16 18:11:05.126054 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.125564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" event={"ID":"343458e2-2056-4ac4-9044-aacda978bc94","Type":"ContainerStarted","Data":"a6d92e4ef53369a20a99ae28e43fb73542b0fdd13caf777380102563a6acb12b"} Apr 16 18:11:05.126054 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.125579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" event={"ID":"343458e2-2056-4ac4-9044-aacda978bc94","Type":"ContainerStarted","Data":"fe1bd57e80180eede44cf10cd12ebe37989707bc8094206cf3f629667cfa2899"} Apr 16 18:11:05.126527 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.126508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"f2e027017772996b92bd83671db3e453d4256b79425b745f2a3b53cba4ead460"} Apr 16 18:11:05.141309 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.141241 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-dwchl" podStartSLOduration=2.870420138 podStartE2EDuration="4.141228501s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:03.092848661 +0000 UTC m=+63.796180164" lastFinishedPulling="2026-04-16 18:11:04.36365701 +0000 UTC m=+65.066988527" observedRunningTime="2026-04-16 18:11:05.141220465 +0000 UTC m=+65.844551989" watchObservedRunningTime="2026-04-16 18:11:05.141228501 +0000 UTC m=+65.844560004" Apr 16 18:11:05.436675 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.436589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59f5c7d586-sdjkg"] Apr 16 18:11:05.439702 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.439681 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.441967 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.441945 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:11:05.442089 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.442004 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:11:05.442399 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.442323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9biidaca4vksj\"" Apr 16 18:11:05.442399 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.442333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:11:05.442399 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.442343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:11:05.442677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.442662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-kmtnn\"" Apr 16 18:11:05.448379 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.448348 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59f5c7d586-sdjkg"] Apr 16 18:11:05.461853 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.461821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67sh4\" (UniqueName: \"kubernetes.io/projected/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-kube-api-access-67sh4\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.461978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.461860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-client-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.461978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.461941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-client-certs\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.462075 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.462058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-metrics-server-audit-profiles\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.462131 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.462118 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-tls\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.462190 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.462140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-audit-log\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.462190 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.462169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563392 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-metrics-server-audit-profiles\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563392 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-tls\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-audit-log\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:11:05.563522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67sh4\" (UniqueName: \"kubernetes.io/projected/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-kube-api-access-67sh4\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-client-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.563993 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.563827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-client-certs\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.564276 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.564214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-audit-log\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.564604 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.564580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-metrics-server-audit-profiles\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.564741 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.564635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.566043 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.566021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:11:05.566375 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.566355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:11:05.566582 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.566562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-client-ca-bundle\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.566667 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.566642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-tls\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.567026 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.566994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-secret-metrics-server-client-certs\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.571493 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.571467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67sh4\" (UniqueName: \"kubernetes.io/projected/ec18e6e3-88bf-44f7-bf1d-bb7bad347114-kube-api-access-67sh4\") pod \"metrics-server-59f5c7d586-sdjkg\" (UID: \"ec18e6e3-88bf-44f7-bf1d-bb7bad347114\") " pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.576254 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.576221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abb881c2-5bd6-4f73-a490-2665c9449ae7-metrics-certs\") pod \"network-metrics-daemon-687m2\" (UID: \"abb881c2-5bd6-4f73-a490-2665c9449ae7\") " pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:11:05.576397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.576380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b097be47-4dfd-4fcd-a8ab-78a2cd491538-original-pull-secret\") pod \"global-pull-secret-syncer-slcjb\" (UID: \"b097be47-4dfd-4fcd-a8ab-78a2cd491538\") " pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:11:05.664811 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.664774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz7\" (UniqueName: \"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:11:05.667486 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.667457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:11:05.677615 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.677588 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:11:05.688753 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.688690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqz7\" (UniqueName: 
\"kubernetes.io/projected/795df744-5070-4941-a2b6-f01fc85241b9-kube-api-access-ksqz7\") pod \"network-check-target-qhbm6\" (UID: \"795df744-5070-4941-a2b6-f01fc85241b9\") " pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:11:05.708525 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.708502 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\"" Apr 16 18:11:05.715316 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.715288 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w4nfv\"" Apr 16 18:11:05.716900 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.716880 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-687m2" Apr 16 18:11:05.719620 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.719598 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-slcjb" Apr 16 18:11:05.723258 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.723230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:11:05.755738 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.755537 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:05.870402 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.870379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-687m2"] Apr 16 18:11:05.874706 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:05.874393 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb881c2_5bd6_4f73_a490_2665c9449ae7.slice/crio-8fbf32134133650870e7d5f6d80ea48525e24f927890371f9c33e84334e014a4 WatchSource:0}: Error finding container 8fbf32134133650870e7d5f6d80ea48525e24f927890371f9c33e84334e014a4: Status 404 returned error can't find the container with id 8fbf32134133650870e7d5f6d80ea48525e24f927890371f9c33e84334e014a4 Apr 16 18:11:05.916693 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.916666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb"] Apr 16 18:11:05.921678 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.921653 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:05.923743 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.923719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-54cc8\"" Apr 16 18:11:05.923892 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.923823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:11:05.927850 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.927820 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb"] Apr 16 18:11:05.931536 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.931495 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59f5c7d586-sdjkg"] Apr 16 18:11:05.934964 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:05.934938 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec18e6e3_88bf_44f7_bf1d_bb7bad347114.slice/crio-d4721b7155efe95d54d314941ff39b1a813f6c8f61faa2b1adfd66bb30791c0f WatchSource:0}: Error finding container d4721b7155efe95d54d314941ff39b1a813f6c8f61faa2b1adfd66bb30791c0f: Status 404 returned error can't find the container with id d4721b7155efe95d54d314941ff39b1a813f6c8f61faa2b1adfd66bb30791c0f Apr 16 18:11:05.968149 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:05.968078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf022871-f563-464d-9350-a198a2295c7f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nd6pb\" (UID: \"bf022871-f563-464d-9350-a198a2295c7f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:06.069476 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.069443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf022871-f563-464d-9350-a198a2295c7f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nd6pb\" (UID: \"bf022871-f563-464d-9350-a198a2295c7f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:06.072609 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.072583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf022871-f563-464d-9350-a198a2295c7f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nd6pb\" (UID: \"bf022871-f563-464d-9350-a198a2295c7f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:06.096012 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.095981 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-slcjb"] Apr 16 18:11:06.097147 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.097126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qhbm6"] Apr 16 18:11:06.131056 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.131021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" event={"ID":"ec18e6e3-88bf-44f7-bf1d-bb7bad347114","Type":"ContainerStarted","Data":"d4721b7155efe95d54d314941ff39b1a813f6c8f61faa2b1adfd66bb30791c0f"} Apr 16 
18:11:06.132175 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.132146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-687m2" event={"ID":"abb881c2-5bd6-4f73-a490-2665c9449ae7","Type":"ContainerStarted","Data":"8fbf32134133650870e7d5f6d80ea48525e24f927890371f9c33e84334e014a4"} Apr 16 18:11:06.234818 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.234737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:06.484385 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:06.484350 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795df744_5070_4941_a2b6_f01fc85241b9.slice/crio-9a0e1045391941e49241a21b232e661db6768450c923b8115e1edd47a99b2a46 WatchSource:0}: Error finding container 9a0e1045391941e49241a21b232e661db6768450c923b8115e1edd47a99b2a46: Status 404 returned error can't find the container with id 9a0e1045391941e49241a21b232e661db6768450c923b8115e1edd47a99b2a46 Apr 16 18:11:06.489711 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:06.489686 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb097be47_4dfd_4fcd_a8ab_78a2cd491538.slice/crio-d086426ecb1ba4c5a5bbf945cb2f014d35c3cb7006319d8e6bdcb4cc750b92f2 WatchSource:0}: Error finding container d086426ecb1ba4c5a5bbf945cb2f014d35c3cb7006319d8e6bdcb4cc750b92f2: Status 404 returned error can't find the container with id d086426ecb1ba4c5a5bbf945cb2f014d35c3cb7006319d8e6bdcb4cc750b92f2 Apr 16 18:11:06.613499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:06.613453 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb"] Apr 16 18:11:06.618136 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:06.618107 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf022871_f563_464d_9350_a198a2295c7f.slice/crio-4c11a6783aae6356368f58296d485ef2690a252fe2db2d8288c5c52ae2aaa5e1 WatchSource:0}: Error finding container 4c11a6783aae6356368f58296d485ef2690a252fe2db2d8288c5c52ae2aaa5e1: Status 404 returned error can't find the container with id 4c11a6783aae6356368f58296d485ef2690a252fe2db2d8288c5c52ae2aaa5e1 Apr 16 18:11:07.138028 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.137988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-slcjb" event={"ID":"b097be47-4dfd-4fcd-a8ab-78a2cd491538","Type":"ContainerStarted","Data":"d086426ecb1ba4c5a5bbf945cb2f014d35c3cb7006319d8e6bdcb4cc750b92f2"} Apr 16 18:11:07.141556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.141301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"51b8fb5558965539e2379dc762317cbc3b90816888153fa7b9c7f559d3586b19"} Apr 16 18:11:07.141556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.141336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"8be09307714730b4336d4db3ea147f17ccf5378d915d1173b77ee14c10dd3c8c"} Apr 16 18:11:07.141556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.141349 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"7c73fbd874006f1055c6bc02bd3d4dd7f1ea51afe2191cffa44a91b01b7bb4fe"} Apr 16 18:11:07.142773 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.142719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" event={"ID":"bf022871-f563-464d-9350-a198a2295c7f","Type":"ContainerStarted","Data":"4c11a6783aae6356368f58296d485ef2690a252fe2db2d8288c5c52ae2aaa5e1"} Apr 16 18:11:07.144262 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:07.144212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qhbm6" event={"ID":"795df744-5070-4941-a2b6-f01fc85241b9","Type":"ContainerStarted","Data":"9a0e1045391941e49241a21b232e661db6768450c923b8115e1edd47a99b2a46"} Apr 16 18:11:08.105205 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.104482 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jxhl9" Apr 16 18:11:08.155594 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.154849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" event={"ID":"ec18e6e3-88bf-44f7-bf1d-bb7bad347114","Type":"ContainerStarted","Data":"0711473dd052d51b107fff866eee6ba3657a0d62beda93a1b37131ca7470823e"} Apr 16 18:11:08.171471 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.170524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" podStartSLOduration=1.075410004 podStartE2EDuration="3.170505566s" podCreationTimestamp="2026-04-16 18:11:05 +0000 UTC" firstStartedPulling="2026-04-16 18:11:05.937093065 +0000 UTC m=+66.640424572" lastFinishedPulling="2026-04-16 18:11:08.032188624 +0000 UTC m=+68.735520134" observedRunningTime="2026-04-16 18:11:08.170059169 +0000 UTC m=+68.873390695" watchObservedRunningTime="2026-04-16 18:11:08.170505566 +0000 UTC m=+68.873837093" Apr 16 18:11:08.882452 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.881028 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:11:08.893717 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.892684 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.894693 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.894650 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.895901 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.896645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.896801 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.896978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.897164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-szppp\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.897365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.897969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:11:08.898652 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.898161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:11:08.902355 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.902336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:11:08.997434 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997613 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997613 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997709 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997620 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997709 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997817 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997733 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:08.997817 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:08.997769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phz9w\" (UniqueName: \"kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098754 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert\") pod 
\"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.098960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.099282 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.098978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phz9w\" (UniqueName: \"kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.100148 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.100111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.100755 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.100712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.100993 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.100954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.101233 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.101206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.102114 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.102089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.102311 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.102291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.106815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.106797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phz9w\" (UniqueName: 
\"kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w\") pod \"console-5c6f89dfd-mrbnf\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.160287 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.160186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-687m2" event={"ID":"abb881c2-5bd6-4f73-a490-2665c9449ae7","Type":"ContainerStarted","Data":"a7c530bf0a2136fb3a9851b7e2c06ac8c5fe9c82fc3c9e3e56015eb9a62794bd"} Apr 16 18:11:09.160287 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.160236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-687m2" event={"ID":"abb881c2-5bd6-4f73-a490-2665c9449ae7","Type":"ContainerStarted","Data":"0824355a72438e65d423d95e2ed9246b54f5437ae489e94da688b86c2002cc18"} Apr 16 18:11:09.210603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.210456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:09.923272 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:09.923198 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-687m2" podStartSLOduration=67.770998815 podStartE2EDuration="1m9.923182308s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:11:05.877444444 +0000 UTC m=+66.580775955" lastFinishedPulling="2026-04-16 18:11:08.029627937 +0000 UTC m=+68.732959448" observedRunningTime="2026-04-16 18:11:09.176011253 +0000 UTC m=+69.879342776" watchObservedRunningTime="2026-04-16 18:11:09.923182308 +0000 UTC m=+70.626513832" Apr 16 18:11:12.162011 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.161984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:11:12.166338 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:11:12.166301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76da9d7_3d0f_4052_b4bf_07c2a88271af.slice/crio-af5867861599f817a00902c5c37bacaff2f37cd46a444781c91dbf1a3484b1e0 WatchSource:0}: Error finding container af5867861599f817a00902c5c37bacaff2f37cd46a444781c91dbf1a3484b1e0: Status 404 returned error can't find the container with id af5867861599f817a00902c5c37bacaff2f37cd46a444781c91dbf1a3484b1e0 Apr 16 18:11:12.176484 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.176455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"5e2d48a10d07f7e1c95e6a9995f5e918301a72c1a7457f7ee73ca9da7e842913"} Apr 16 18:11:12.179469 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.179432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" event={"ID":"bf022871-f563-464d-9350-a198a2295c7f","Type":"ContainerStarted","Data":"508f825c9d85bc6fcc149c06bdafaa8f5cdc69c77da8f120078093e3b0f451dc"} Apr 16 18:11:12.179660 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.179616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:12.183453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.183405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-qhbm6" event={"ID":"795df744-5070-4941-a2b6-f01fc85241b9","Type":"ContainerStarted","Data":"a80c2fd3792eb1cec884226125891b66d275ecaba17febdd161631cce7f1d1f0"} Apr 16 18:11:12.183637 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.183609 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:11:12.185537 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.185474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" Apr 16 18:11:12.185798 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.185768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6f89dfd-mrbnf" event={"ID":"f76da9d7-3d0f-4052-b4bf-07c2a88271af","Type":"ContainerStarted","Data":"af5867861599f817a00902c5c37bacaff2f37cd46a444781c91dbf1a3484b1e0"} Apr 16 18:11:12.194744 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.194689 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nd6pb" podStartSLOduration=1.785962625 podStartE2EDuration="7.194677936s" podCreationTimestamp="2026-04-16 18:11:05 +0000 UTC" firstStartedPulling="2026-04-16 18:11:06.621208023 +0000 UTC m=+67.324539539" lastFinishedPulling="2026-04-16 18:11:12.029923348 +0000 UTC m=+72.733254850" observedRunningTime="2026-04-16 18:11:12.193315832 +0000 UTC m=+72.896647354" watchObservedRunningTime="2026-04-16 18:11:12.194677936 +0000 UTC m=+72.898009459" Apr 16 18:11:12.223217 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:12.223155 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qhbm6" podStartSLOduration=66.669971706 podStartE2EDuration="1m12.223137152s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:11:06.486098015 +0000 UTC m=+67.189429517" lastFinishedPulling="2026-04-16 18:11:12.03926346 +0000 UTC m=+72.742594963" observedRunningTime="2026-04-16 18:11:12.206987967 +0000 UTC m=+72.910319491" watchObservedRunningTime="2026-04-16 18:11:12.223137152 +0000 UTC m=+72.926468676" Apr 16 18:11:13.192093 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.191973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"be8c0e4051d643c128f0a7b3d7b5e4b255645d9b69da7bf0482735db8a9e769e"} Apr 16 18:11:13.192093 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.192020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" event={"ID":"252e09bb-e8cc-4008-9e5a-8c49ae1beec9","Type":"ContainerStarted","Data":"8c8cb9fc1f0e7a59cf8adb125527d285f439104f49ffd6e71c22b10873921e61"} Apr 16 18:11:13.192529 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.192169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:13.193539 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.193512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-slcjb" event={"ID":"b097be47-4dfd-4fcd-a8ab-78a2cd491538","Type":"ContainerStarted","Data":"c490fe80a194148ebc95e8aa12793aed870127cc0760fa11944519896cf4791c"} Apr 
16 18:11:13.198314 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.198296 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" Apr 16 18:11:13.214227 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.214176 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f49994ccb-rb9jz" podStartSLOduration=1.852131209 podStartE2EDuration="9.214164027s" podCreationTimestamp="2026-04-16 18:11:04 +0000 UTC" firstStartedPulling="2026-04-16 18:11:04.669772425 +0000 UTC m=+65.373103927" lastFinishedPulling="2026-04-16 18:11:12.031805229 +0000 UTC m=+72.735136745" observedRunningTime="2026-04-16 18:11:13.212535107 +0000 UTC m=+73.915866642" watchObservedRunningTime="2026-04-16 18:11:13.214164027 +0000 UTC m=+73.917495551" Apr 16 18:11:13.250108 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:13.250054 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-slcjb" podStartSLOduration=67.597427993 podStartE2EDuration="1m13.250021835s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:11:06.492423938 +0000 UTC m=+67.195755441" lastFinishedPulling="2026-04-16 18:11:12.145017782 +0000 UTC m=+72.848349283" observedRunningTime="2026-04-16 18:11:13.248971665 +0000 UTC m=+73.952303190" watchObservedRunningTime="2026-04-16 18:11:13.250021835 +0000 UTC m=+73.953353363" Apr 16 18:11:15.086985 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:15.086956 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f98dfdd99-ll2gn" Apr 16 18:11:16.211061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:16.211021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6f89dfd-mrbnf" event={"ID":"f76da9d7-3d0f-4052-b4bf-07c2a88271af","Type":"ContainerStarted","Data":"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a"} Apr 16 18:11:16.229743 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:16.229678 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6f89dfd-mrbnf" podStartSLOduration=4.34950239 podStartE2EDuration="8.229662152s" podCreationTimestamp="2026-04-16 18:11:08 +0000 UTC" firstStartedPulling="2026-04-16 18:11:12.168422265 +0000 UTC m=+72.871753773" lastFinishedPulling="2026-04-16 18:11:16.048582033 +0000 UTC m=+76.751913535" observedRunningTime="2026-04-16 18:11:16.228977949 +0000 UTC m=+76.932309474" watchObservedRunningTime="2026-04-16 18:11:16.229662152 +0000 UTC m=+76.932993678" Apr 16 18:11:19.211081 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:19.211037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:19.211557 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:19.211097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:19.216051 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:19.216030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:19.223008 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:19.222987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:11:25.756491 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:11:25.756435 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:25.756491 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:25.756500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:43.196366 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:43.196336 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qhbm6" Apr 16 18:11:45.760754 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:45.760719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:11:45.764562 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:11:45.764540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59f5c7d586-sdjkg" Apr 16 18:12:29.237786 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.237750 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:12:29.241164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.241148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.254099 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.254073 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:12:29.326859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.326829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.326859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.326864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.327061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.326889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.327061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.326951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.327061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.327031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.327061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.327055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.327185 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.327096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddkfq\" (UniqueName: \"kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427456 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427456 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.427705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" 
Apr 16 18:12:29.427705 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.427569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddkfq\" (UniqueName: \"kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.428288 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.428234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.428421 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.428360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.428421 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.428393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.428748 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.428727 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.429887 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.429865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.430040 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.430023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.437029 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.437003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddkfq\" (UniqueName: \"kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq\") pod \"console-69b58bcf8b-qcn9b\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.549282 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.549176 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:29.670198 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:29.670170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:12:29.672535 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:12:29.672512 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3eb80b_5cea_4449_b02e_3253c40201ba.slice/crio-886dc698a0bb47f495387a77eb4fb821e6a3c94b73005f90c81a671415b7d335 WatchSource:0}: Error finding container 886dc698a0bb47f495387a77eb4fb821e6a3c94b73005f90c81a671415b7d335: Status 404 returned error can't find the container with id 886dc698a0bb47f495387a77eb4fb821e6a3c94b73005f90c81a671415b7d335 Apr 16 18:12:30.425978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:30.425940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b58bcf8b-qcn9b" event={"ID":"de3eb80b-5cea-4449-b02e-3253c40201ba","Type":"ContainerStarted","Data":"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0"} Apr 16 18:12:30.425978 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:30.425979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b58bcf8b-qcn9b" event={"ID":"de3eb80b-5cea-4449-b02e-3253c40201ba","Type":"ContainerStarted","Data":"886dc698a0bb47f495387a77eb4fb821e6a3c94b73005f90c81a671415b7d335"} Apr 16 18:12:30.444130 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:30.444084 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b58bcf8b-qcn9b" podStartSLOduration=1.444067332 podStartE2EDuration="1.444067332s" podCreationTimestamp="2026-04-16 18:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:30.442915885 +0000 UTC m=+151.146247410" watchObservedRunningTime="2026-04-16 18:12:30.444067332 +0000 UTC m=+151.147398857" Apr 16 18:12:39.549453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:39.549414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:39.549840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:39.549468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:39.554165 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:39.554134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:40.455440 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:40.455414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:12:40.508538 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:12:40.508510 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:13:05.528189 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.528082 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c6f89dfd-mrbnf" podUID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" containerName="console" containerID="cri-o://a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a" gracePeriod=15 Apr 16 18:13:05.768923 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.768902 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6f89dfd-mrbnf_f76da9d7-3d0f-4052-b4bf-07c2a88271af/console/0.log" Apr 16 18:13:05.769046 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.768965 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:13:05.790448 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.790356 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.790448 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.790410 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.790643 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.790452 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.790846 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.790816 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:13:05.791081 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.791058 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config" (OuterVolumeSpecName: "console-config") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:13:05.792822 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.792800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:13:05.890916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.890865 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.890916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.890915 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.890916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.890934 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phz9w\" (UniqueName: \"kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.891170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.890967 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config\") pod \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\" (UID: \"f76da9d7-3d0f-4052-b4bf-07c2a88271af\") " Apr 16 18:13:05.891170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.891118 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.891170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.891133 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-trusted-ca-bundle\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.891170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.891147 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.891459 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.891208 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca" (OuterVolumeSpecName: "service-ca") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:13:05.891510 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.891452 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:13:05.893191 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.893169 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:13:05.893296 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.893170 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w" (OuterVolumeSpecName: "kube-api-access-phz9w") pod "f76da9d7-3d0f-4052-b4bf-07c2a88271af" (UID: "f76da9d7-3d0f-4052-b4bf-07c2a88271af"). InnerVolumeSpecName "kube-api-access-phz9w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:13:05.991455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.991407 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phz9w\" (UniqueName: \"kubernetes.io/projected/f76da9d7-3d0f-4052-b4bf-07c2a88271af-kube-api-access-phz9w\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.991455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.991451 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f76da9d7-3d0f-4052-b4bf-07c2a88271af-console-oauth-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.991455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.991461 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-oauth-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:05.991455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:05.991470 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f76da9d7-3d0f-4052-b4bf-07c2a88271af-service-ca\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:13:06.525287 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525239 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6f89dfd-mrbnf_f76da9d7-3d0f-4052-b4bf-07c2a88271af/console/0.log" Apr 16 18:13:06.525453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525299 2576 generic.go:358] "Generic (PLEG): container finished" podID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" containerID="a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a" exitCode=2 Apr 16 18:13:06.525453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6f89dfd-mrbnf" event={"ID":"f76da9d7-3d0f-4052-b4bf-07c2a88271af","Type":"ContainerDied","Data":"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a"} Apr 16 18:13:06.525453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525365 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6f89dfd-mrbnf" Apr 16 18:13:06.525453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6f89dfd-mrbnf" event={"ID":"f76da9d7-3d0f-4052-b4bf-07c2a88271af","Type":"ContainerDied","Data":"af5867861599f817a00902c5c37bacaff2f37cd46a444781c91dbf1a3484b1e0"} Apr 16 18:13:06.525453 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.525395 2576 scope.go:117] "RemoveContainer" containerID="a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a" Apr 16 18:13:06.535381 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.535316 2576 scope.go:117] "RemoveContainer" containerID="a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a" Apr 16 18:13:06.535713 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:13:06.535688 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a\": container with ID starting with a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a not found: ID does not exist" containerID="a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a" Apr 16 18:13:06.535763 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.535727 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a"} err="failed to get container status \"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a\": rpc error: code = NotFound desc = could not find container \"a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a\": container with ID starting with a6f04f3608492617fead644570772548d4eed926842a74fe72496faa4a61fd5a not found: ID does not exist" Apr 16 18:13:06.541472 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.541447 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:13:06.544915 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:06.544895 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c6f89dfd-mrbnf"] Apr 16 18:13:07.898611 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:07.898576 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" path="/var/lib/kubelet/pods/f76da9d7-3d0f-4052-b4bf-07c2a88271af/volumes" Apr 16 18:13:44.137684 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.137649 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:13:44.138124 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.137900 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" containerName="console" Apr 16 18:13:44.138124 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.137910 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" containerName="console" Apr 16 18:13:44.138124 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.137956 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f76da9d7-3d0f-4052-b4bf-07c2a88271af" containerName="console" Apr 16 18:13:44.140534 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.140516 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.150766 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.150744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:13:44.260305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhjt\" (UniqueName: \"kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.260513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.260452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361511 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca\") pod 
\"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361511 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhjt\" (UniqueName: \"kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.361774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.361646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.362367 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.362341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.362495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.362402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.362495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.362477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.362685 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.362667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.364283 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.364234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.364374 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.364234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.368821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.368800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhjt\" (UniqueName: \"kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt\") pod \"console-7d9f45f7f6-p679k\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.449383 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.449301 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:44.564204 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.564173 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:13:44.567458 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:13:44.567431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce34c05d_d3c5_4938_86fb_d746afebb402.slice/crio-fdbeb964a7478c54dde84c635010e646ec8da4db230c01734e25fa720a53a6d6 WatchSource:0}: Error finding container fdbeb964a7478c54dde84c635010e646ec8da4db230c01734e25fa720a53a6d6: Status 404 returned error can't find the container with id fdbeb964a7478c54dde84c635010e646ec8da4db230c01734e25fa720a53a6d6 Apr 16 18:13:44.623356 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:44.623329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d9f45f7f6-p679k" event={"ID":"ce34c05d-d3c5-4938-86fb-d746afebb402","Type":"ContainerStarted","Data":"fdbeb964a7478c54dde84c635010e646ec8da4db230c01734e25fa720a53a6d6"} Apr 16 18:13:45.627011 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:45.626976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d9f45f7f6-p679k" event={"ID":"ce34c05d-d3c5-4938-86fb-d746afebb402","Type":"ContainerStarted","Data":"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f"} Apr 16 18:13:45.643648 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:45.643600 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d9f45f7f6-p679k" podStartSLOduration=1.643587123 podStartE2EDuration="1.643587123s" podCreationTimestamp="2026-04-16 18:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:13:45.642556413 +0000 UTC m=+226.345887937" watchObservedRunningTime="2026-04-16 18:13:45.643587123 +0000 UTC m=+226.346918647" Apr 16 18:13:54.449540 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:54.449497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:54.449540 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:54.449538 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:54.454340 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:54.454319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:54.655377 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:54.655348 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:13:54.703585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:13:54.703505 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:14:19.727068 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:19.726975 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69b58bcf8b-qcn9b" podUID="de3eb80b-5cea-4449-b02e-3253c40201ba" containerName="console" containerID="cri-o://14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0" gracePeriod=15 Apr 16 18:14:19.957626 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:19.957605 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b58bcf8b-qcn9b_de3eb80b-5cea-4449-b02e-3253c40201ba/console/0.log" Apr 16 18:14:19.957750 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:19.957661 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:14:20.049923 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.049923 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049881 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.049923 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049920 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.050186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049945 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.050186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.050186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.049997 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.050186 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.050035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddkfq\" (UniqueName: \"kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq\") pod \"de3eb80b-5cea-4449-b02e-3253c40201ba\" (UID: \"de3eb80b-5cea-4449-b02e-3253c40201ba\") " Apr 16 18:14:20.050386 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.050305 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:20.050386 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.050343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:20.050689 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.050661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config" (OuterVolumeSpecName: "console-config") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:20.050689 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.050674 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:14:20.052370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.052345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:20.052469 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.052375 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:20.052469 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.052377 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq" (OuterVolumeSpecName: "kube-api-access-ddkfq") pod "de3eb80b-5cea-4449-b02e-3253c40201ba" (UID: "de3eb80b-5cea-4449-b02e-3253c40201ba"). InnerVolumeSpecName "kube-api-access-ddkfq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:20.151627 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151581 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-oauth-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151627 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151620 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-trusted-ca-bundle\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151627 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151635 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-oauth-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151648 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-console-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151661 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de3eb80b-5cea-4449-b02e-3253c40201ba-service-ca\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151673 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de3eb80b-5cea-4449-b02e-3253c40201ba-console-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.151859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.151684 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddkfq\" (UniqueName: \"kubernetes.io/projected/de3eb80b-5cea-4449-b02e-3253c40201ba-kube-api-access-ddkfq\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:20.726369 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b58bcf8b-qcn9b_de3eb80b-5cea-4449-b02e-3253c40201ba/console/0.log" Apr 16 18:14:20.726572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726379 2576 generic.go:358] "Generic (PLEG): container finished" podID="de3eb80b-5cea-4449-b02e-3253c40201ba" containerID="14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0" exitCode=2 Apr 16 18:14:20.726572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b58bcf8b-qcn9b" event={"ID":"de3eb80b-5cea-4449-b02e-3253c40201ba","Type":"ContainerDied","Data":"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0"} Apr 16 18:14:20.726572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b58bcf8b-qcn9b" event={"ID":"de3eb80b-5cea-4449-b02e-3253c40201ba","Type":"ContainerDied","Data":"886dc698a0bb47f495387a77eb4fb821e6a3c94b73005f90c81a671415b7d335"} Apr 16 18:14:20.726572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726465 2576 scope.go:117] 
"RemoveContainer" containerID="14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0" Apr 16 18:14:20.726572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.726465 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b58bcf8b-qcn9b" Apr 16 18:14:20.734338 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.733846 2576 scope.go:117] "RemoveContainer" containerID="14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0" Apr 16 18:14:20.734338 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:14:20.734270 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0\": container with ID starting with 14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0 not found: ID does not exist" containerID="14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0" Apr 16 18:14:20.734338 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.734304 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0"} err="failed to get container status \"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0\": rpc error: code = NotFound desc = could not find container \"14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0\": container with ID starting with 14ffee2eeedbd7b380ff9a2c32510fc0e461eabefed55d4d20a2668cf5ca3cf0 not found: ID does not exist" Apr 16 18:14:20.745525 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.745502 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:14:20.750709 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:20.750688 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69b58bcf8b-qcn9b"] Apr 16 18:14:21.897948 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:21.897913 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3eb80b-5cea-4449-b02e-3253c40201ba" path="/var/lib/kubelet/pods/de3eb80b-5cea-4449-b02e-3253c40201ba/volumes" Apr 16 18:14:37.016788 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.016755 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n"] Apr 16 18:14:37.017189 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.017025 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de3eb80b-5cea-4449-b02e-3253c40201ba" containerName="console" Apr 16 18:14:37.017189 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.017037 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3eb80b-5cea-4449-b02e-3253c40201ba" containerName="console" Apr 16 18:14:37.017189 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.017097 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="de3eb80b-5cea-4449-b02e-3253c40201ba" containerName="console" Apr 16 18:14:37.020148 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.020131 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.022360 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.022337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q26zn\"" Apr 16 18:14:37.022477 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.022340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:14:37.022933 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.022919 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:14:37.026460 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.026434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n"] Apr 16 18:14:37.074924 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.074881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.075083 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.074937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.075083 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.075004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.175806 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.175768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.175960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.175857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.175960 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.175892 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.176240 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.176219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.176326 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.176309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.183682 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.183664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.330373 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.330271 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:37.443487 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.443452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n"] Apr 16 18:14:37.446231 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:14:37.446201 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1395867e_0af7_4170_8142_9bfc51e76412.slice/crio-473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea WatchSource:0}: Error finding container 473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea: Status 404 returned error can't find the container with id 473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea Apr 16 18:14:37.771401 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:37.771364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" event={"ID":"1395867e-0af7-4170-8142-9bfc51e76412","Type":"ContainerStarted","Data":"473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea"} Apr 16 18:14:42.789127 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:42.789093 2576 generic.go:358] "Generic (PLEG): container finished" podID="1395867e-0af7-4170-8142-9bfc51e76412" containerID="1262a45888fc60584254c0ad8b39a4bfc6aa5ee61b0ee371860b6b8bd54296f7" exitCode=0 Apr 16 18:14:42.789517 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:42.789168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" event={"ID":"1395867e-0af7-4170-8142-9bfc51e76412","Type":"ContainerDied","Data":"1262a45888fc60584254c0ad8b39a4bfc6aa5ee61b0ee371860b6b8bd54296f7"} Apr 16 18:14:45.800209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:45.800174 2576 generic.go:358] "Generic (PLEG): container finished" podID="1395867e-0af7-4170-8142-9bfc51e76412" containerID="cd74b862ff17d2b7f9f19aa452d985fed266fb317836398a7f8adca89c877b6d" exitCode=0 Apr 16 18:14:45.800589 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:45.800275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" event={"ID":"1395867e-0af7-4170-8142-9bfc51e76412","Type":"ContainerDied","Data":"cd74b862ff17d2b7f9f19aa452d985fed266fb317836398a7f8adca89c877b6d"} Apr 16 18:14:51.819787 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:51.819753 2576 generic.go:358] "Generic (PLEG): container finished" podID="1395867e-0af7-4170-8142-9bfc51e76412" containerID="811ace680e3f251180d3242228cf8592f64b5a70c87b583bd479ea7ca0a8a1fa" exitCode=0 Apr 16 18:14:51.820188 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:51.819836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" event={"ID":"1395867e-0af7-4170-8142-9bfc51e76412","Type":"ContainerDied","Data":"811ace680e3f251180d3242228cf8592f64b5a70c87b583bd479ea7ca0a8a1fa"} Apr 16 18:14:52.935741 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:52.935718 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:53.015639 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.015600 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle\") pod \"1395867e-0af7-4170-8142-9bfc51e76412\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " Apr 16 18:14:53.015825 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.015665 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util\") pod \"1395867e-0af7-4170-8142-9bfc51e76412\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " Apr 16 18:14:53.015825 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.015698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc\") pod \"1395867e-0af7-4170-8142-9bfc51e76412\" (UID: \"1395867e-0af7-4170-8142-9bfc51e76412\") " Apr 16 18:14:53.016332 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.016303 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle" (OuterVolumeSpecName: "bundle") pod "1395867e-0af7-4170-8142-9bfc51e76412" (UID: "1395867e-0af7-4170-8142-9bfc51e76412"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:53.017971 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.017948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc" (OuterVolumeSpecName: "kube-api-access-qc7tc") pod "1395867e-0af7-4170-8142-9bfc51e76412" (UID: "1395867e-0af7-4170-8142-9bfc51e76412"). InnerVolumeSpecName "kube-api-access-qc7tc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:53.020654 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.020631 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util" (OuterVolumeSpecName: "util") pod "1395867e-0af7-4170-8142-9bfc51e76412" (UID: "1395867e-0af7-4170-8142-9bfc51e76412"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:53.116833 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.116754 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-util\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:53.116833 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.116783 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qc7tc\" (UniqueName: \"kubernetes.io/projected/1395867e-0af7-4170-8142-9bfc51e76412-kube-api-access-qc7tc\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:53.116833 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.116796 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1395867e-0af7-4170-8142-9bfc51e76412-bundle\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:14:53.826967 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.826928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" event={"ID":"1395867e-0af7-4170-8142-9bfc51e76412","Type":"ContainerDied","Data":"473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea"} Apr 16 18:14:53.826967 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.826964 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4rs7n" Apr 16 18:14:53.826967 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:53.826967 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="473ab166795ec418881ba63076e001c54348727c0c7e1813f034d62409259eea" Apr 16 18:14:58.660020 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.659987 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx"] Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660275 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="util" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660287 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="util" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660296 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="extract" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660302 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="extract" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660316 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="pull" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660322 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="pull" Apr 16 18:14:58.660393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.660362 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1395867e-0af7-4170-8142-9bfc51e76412" containerName="extract" Apr 16 18:14:58.707509 
ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.707475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx"] Apr 16 18:14:58.707661 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.707608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.710087 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.710061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:14:58.710230 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.710179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:14:58.710320 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.710230 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:14:58.710382 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.710337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-tqk7f\"" Apr 16 18:14:58.858560 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.858531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bd564c09-83e1-4527-b197-f93c2e26d800-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.858729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.858591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s4k6\" (UniqueName: \"kubernetes.io/projected/bd564c09-83e1-4527-b197-f93c2e26d800-kube-api-access-4s4k6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.959635 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.959558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bd564c09-83e1-4527-b197-f93c2e26d800-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.959635 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.959620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s4k6\" (UniqueName: \"kubernetes.io/projected/bd564c09-83e1-4527-b197-f93c2e26d800-kube-api-access-4s4k6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.961893 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.961861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bd564c09-83e1-4527-b197-f93c2e26d800-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:58.967520 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:58.967500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s4k6\" (UniqueName: \"kubernetes.io/projected/bd564c09-83e1-4527-b197-f93c2e26d800-kube-api-access-4s4k6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx\" (UID: \"bd564c09-83e1-4527-b197-f93c2e26d800\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:59.017694 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:59.017663 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:14:59.137087 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:59.137063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx"] Apr 16 18:14:59.139374 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:14:59.139343 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd564c09_83e1_4527_b197_f93c2e26d800.slice/crio-97d845cbcbc968adb30d32ce26638a2c2b75f876c4bb80bbc097869605d91ae7 WatchSource:0}: Error finding container 97d845cbcbc968adb30d32ce26638a2c2b75f876c4bb80bbc097869605d91ae7: Status 404 returned error can't find the container with id 97d845cbcbc968adb30d32ce26638a2c2b75f876c4bb80bbc097869605d91ae7 Apr 16 18:14:59.795737 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:59.795713 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:14:59.844876 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:14:59.844708 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" event={"ID":"bd564c09-83e1-4527-b197-f93c2e26d800","Type":"ContainerStarted","Data":"97d845cbcbc968adb30d32ce26638a2c2b75f876c4bb80bbc097869605d91ae7"} Apr 16 18:15:08.514977 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.514942 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vw2bw"] Apr 16 18:15:08.517150 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.517131 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.519148 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.519124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:15:08.519148 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.519136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:15:08.519333 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.519184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-g6j8t\"" Apr 16 18:15:08.525276 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.525231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vw2bw"] Apr 16 18:15:08.641480 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.641446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ftm\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-kube-api-access-r9ftm\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.641657 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.641500 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-cabundle0\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.641657 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.641590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.742687 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.742646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.742890 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.742705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ftm\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-kube-api-access-r9ftm\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.742890 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:08.742842 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:15:08.742890 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:08.742867 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:15:08.742890 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:08.742880 2576 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-vw2bw: references non-existent secret key: ca.crt Apr 16 18:15:08.743112 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.742900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-cabundle0\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.743112 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:08.742965 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates podName:7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa nodeName:}" failed. No retries permitted until 2026-04-16 18:15:09.242927034 +0000 UTC m=+309.946258536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates") pod "keda-operator-ffbb595cb-vw2bw" (UID: "7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa") : references non-existent secret key: ca.crt Apr 16 18:15:08.743573 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.743548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-cabundle0\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.752828 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.752803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ftm\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-kube-api-access-r9ftm\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:08.775050 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.774983 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw"] Apr 16 18:15:08.777088 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.777067 2576 util.go:30] "No sandbox for pod can be found. 
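[Annotation] The MountVolume.SetUp failure for the "certificates" projected volume means the openshift-keda/kedaorg-certs secret exists but does not (yet) contain the ca.crt key the projection asks for (a later entry reports the same secret missing tls.crt for the keda-metrics-apiserver pod); the kubelet simply retries after the logged 500ms backoff until the keys appear. A small diagnostic sketch using client-go to report which expected keys are missing from that secret — illustrative only, assuming a local kubeconfig, and not part of any tooling referenced in the log:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Namespace, secret name, and expected keys are taken from the log entries above.
	const ns, name = "openshift-keda", "kedaorg-certs"
	expected := []string{"ca.crt", "tls.crt"}

	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	sec, err := cs.CoreV1().Secrets(ns).Get(context.Background(), name, metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, k := range expected {
		if _, ok := sec.Data[k]; !ok {
			fmt.Printf("secret %s/%s is missing key %q\n", ns, name, k)
		}
	}
}
```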
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:08.778924 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.778899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:15:08.785496 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.785477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw"] Apr 16 18:15:08.871204 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.871168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" event={"ID":"bd564c09-83e1-4527-b197-f93c2e26d800","Type":"ContainerStarted","Data":"6056c12ebacf976bcc1a3135ef6d37f1d4c81b464272a3c05f61ed12af47ed97"} Apr 16 18:15:08.871389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.871290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:15:08.887454 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.887382 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" podStartSLOduration=2.034448834 podStartE2EDuration="10.887363192s" podCreationTimestamp="2026-04-16 18:14:58 +0000 UTC" firstStartedPulling="2026-04-16 18:14:59.140993311 +0000 UTC m=+299.844324814" lastFinishedPulling="2026-04-16 18:15:07.99390767 +0000 UTC m=+308.697239172" observedRunningTime="2026-04-16 18:15:08.886702682 +0000 UTC m=+309.590034216" watchObservedRunningTime="2026-04-16 18:15:08.887363192 +0000 UTC m=+309.590694716" Apr 16 18:15:08.944385 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.944343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13b3319e-53fb-43f5-b689-b1cb0a107859-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:08.944582 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.944401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbkt4\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-kube-api-access-dbkt4\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:08.944582 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:08.944464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.044887 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.044802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 
18:15:09.045042 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.044949 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:15:09.045042 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.044974 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:15:09.045042 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.044971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13b3319e-53fb-43f5-b689-b1cb0a107859-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.045042 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.044999 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw: references non-existent secret key: tls.crt Apr 16 18:15:09.045200 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.045068 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates podName:13b3319e-53fb-43f5-b689-b1cb0a107859 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:09.545046861 +0000 UTC m=+310.248378362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates") pod "keda-metrics-apiserver-7c9f485588-v7rzw" (UID: "13b3319e-53fb-43f5-b689-b1cb0a107859") : references non-existent secret key: tls.crt Apr 16 18:15:09.045200 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.045090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbkt4\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-kube-api-access-dbkt4\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.045340 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.045321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/13b3319e-53fb-43f5-b689-b1cb0a107859-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.056688 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.056655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbkt4\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-kube-api-access-dbkt4\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.069284 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.069258 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hvmn"] Apr 16 18:15:09.071773 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.071759 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.073872 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.073854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 18:15:09.080701 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.080678 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hvmn"] Apr 16 18:15:09.247016 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.246978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswdz\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-kube-api-access-sswdz\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.247016 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.247019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:09.247299 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.247055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-certificates\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.247299 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.247197 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:15:09.247299 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.247221 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:15:09.247299 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.247235 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vw2bw: references non-existent secret key: ca.crt Apr 16 18:15:09.247494 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.247323 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates podName:7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa nodeName:}" failed. No retries permitted until 2026-04-16 18:15:10.247301453 +0000 UTC m=+310.950632955 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates") pod "keda-operator-ffbb595cb-vw2bw" (UID: "7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa") : references non-existent secret key: ca.crt Apr 16 18:15:09.347695 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.347605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sswdz\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-kube-api-access-sswdz\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.347695 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.347674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-certificates\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.350065 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.350041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-certificates\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.354688 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.354664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswdz\" (UniqueName: \"kubernetes.io/projected/bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb-kube-api-access-sswdz\") pod \"keda-admission-cf49989db-6hvmn\" (UID: \"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb\") " pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.382539 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.382508 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:09.498593 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.498549 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hvmn"] Apr 16 18:15:09.502211 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:15:09.502183 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5ebc36_ffdf_49a9_9c9a_ce6630997bbb.slice/crio-aa45f89ec4ee8c9fd901dc925f4b65ee76345cefb12f00701e2792176549191e WatchSource:0}: Error finding container aa45f89ec4ee8c9fd901dc925f4b65ee76345cefb12f00701e2792176549191e: Status 404 returned error can't find the container with id aa45f89ec4ee8c9fd901dc925f4b65ee76345cefb12f00701e2792176549191e Apr 16 18:15:09.503500 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.503482 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:09.549515 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.549480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:09.549848 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.549596 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:15:09.549848 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.549609 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:15:09.549848 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.549626 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw: references non-existent secret key: tls.crt Apr 16 18:15:09.549848 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:09.549677 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates podName:13b3319e-53fb-43f5-b689-b1cb0a107859 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:10.549662353 +0000 UTC m=+311.252993856 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates") pod "keda-metrics-apiserver-7c9f485588-v7rzw" (UID: "13b3319e-53fb-43f5-b689-b1cb0a107859") : references non-existent secret key: tls.crt Apr 16 18:15:09.875433 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:09.875398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6hvmn" event={"ID":"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb","Type":"ContainerStarted","Data":"aa45f89ec4ee8c9fd901dc925f4b65ee76345cefb12f00701e2792176549191e"} Apr 16 18:15:10.255983 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:10.255905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:10.256115 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.256018 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:15:10.256115 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.256030 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:15:10.256115 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.256039 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vw2bw: references non-existent secret key: ca.crt Apr 16 18:15:10.256115 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.256085 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates podName:7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa nodeName:}" failed. No retries permitted until 2026-04-16 18:15:12.256072918 +0000 UTC m=+312.959404420 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates") pod "keda-operator-ffbb595cb-vw2bw" (UID: "7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa") : references non-existent secret key: ca.crt Apr 16 18:15:10.559333 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:10.559257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:10.559673 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.559357 2576 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:15:10.559673 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.559380 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:15:10.559673 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.559398 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw: references non-existent secret key: tls.crt Apr 16 18:15:10.559673 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:10.559450 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates podName:13b3319e-53fb-43f5-b689-b1cb0a107859 nodeName:}" failed. No retries permitted until 2026-04-16 18:15:12.559434811 +0000 UTC m=+313.262766313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates") pod "keda-metrics-apiserver-7c9f485588-v7rzw" (UID: "13b3319e-53fb-43f5-b689-b1cb0a107859") : references non-existent secret key: tls.crt Apr 16 18:15:10.881875 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:10.881841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6hvmn" event={"ID":"bf5ebc36-ffdf-49a9-9c9a-ce6630997bbb","Type":"ContainerStarted","Data":"b705cac7731de3ce5ee5c869a6545b897c9129af8df7b5c2d32fa1b9f9af6d7f"} Apr 16 18:15:10.882045 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:10.881977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:10.897668 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:10.897621 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6hvmn" podStartSLOduration=0.836739498 podStartE2EDuration="1.897608178s" podCreationTimestamp="2026-04-16 18:15:09 +0000 UTC" firstStartedPulling="2026-04-16 18:15:09.503602578 +0000 UTC m=+310.206934081" lastFinishedPulling="2026-04-16 18:15:10.564471258 +0000 UTC m=+311.267802761" observedRunningTime="2026-04-16 18:15:10.896486155 +0000 UTC m=+311.599817692" watchObservedRunningTime="2026-04-16 18:15:10.897608178 +0000 UTC m=+311.600939706" Apr 16 18:15:12.274361 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.274327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: 
\"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:12.274771 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:12.274446 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:15:12.274771 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:12.274460 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:15:12.274771 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:12.274469 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vw2bw: references non-existent secret key: ca.crt Apr 16 18:15:12.274771 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:15:12.274515 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates podName:7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa nodeName:}" failed. No retries permitted until 2026-04-16 18:15:16.27450292 +0000 UTC m=+316.977834421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates") pod "keda-operator-ffbb595cb-vw2bw" (UID: "7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa") : references non-existent secret key: ca.crt Apr 16 18:15:12.576924 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.576826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:12.579354 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.579337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/13b3319e-53fb-43f5-b689-b1cb0a107859-certificates\") pod \"keda-metrics-apiserver-7c9f485588-v7rzw\" (UID: \"13b3319e-53fb-43f5-b689-b1cb0a107859\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:12.687595 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.687563 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:12.804648 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.804504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw"] Apr 16 18:15:12.807423 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:15:12.807390 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b3319e_53fb_43f5_b689_b1cb0a107859.slice/crio-8ef6f14a7eae971b8599cf5bdf4fbfae6f106ab11899d92f86b5d2a339485611 WatchSource:0}: Error finding container 8ef6f14a7eae971b8599cf5bdf4fbfae6f106ab11899d92f86b5d2a339485611: Status 404 returned error can't find the container with id 8ef6f14a7eae971b8599cf5bdf4fbfae6f106ab11899d92f86b5d2a339485611 Apr 16 18:15:12.888223 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:12.888188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" event={"ID":"13b3319e-53fb-43f5-b689-b1cb0a107859","Type":"ContainerStarted","Data":"8ef6f14a7eae971b8599cf5bdf4fbfae6f106ab11899d92f86b5d2a339485611"} Apr 16 18:15:15.901646 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:15.901613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" event={"ID":"13b3319e-53fb-43f5-b689-b1cb0a107859","Type":"ContainerStarted","Data":"7934b66c85e9ab41c302cfb68fe1eb557b4c3dd46801d6522142afa0400f3165"} Apr 16 18:15:15.902010 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:15.901670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:15.917939 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:15.917893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" podStartSLOduration=5.787714355 podStartE2EDuration="7.917877451s" podCreationTimestamp="2026-04-16 18:15:08 +0000 UTC" firstStartedPulling="2026-04-16 18:15:12.809110805 +0000 UTC m=+313.512442308" lastFinishedPulling="2026-04-16 18:15:14.939273892 +0000 UTC m=+315.642605404" observedRunningTime="2026-04-16 18:15:15.916014821 +0000 UTC m=+316.619346344" watchObservedRunningTime="2026-04-16 18:15:15.917877451 +0000 UTC m=+316.621208974" Apr 16 18:15:16.308775 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:16.308679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:16.311054 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:16.311032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa-certificates\") pod \"keda-operator-ffbb595cb-vw2bw\" (UID: \"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa\") " pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:16.327955 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:16.327924 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:16.443716 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:16.443654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vw2bw"] Apr 16 18:15:16.446092 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:15:16.446065 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c7a6ee9_b8d7_4906_a9be_22ca6504b9aa.slice/crio-32b68cd14f1977c684caef97640276e93e958d5cc8e68b6ea0a88712c22cd553 WatchSource:0}: Error finding container 32b68cd14f1977c684caef97640276e93e958d5cc8e68b6ea0a88712c22cd553: Status 404 returned error can't find the container with id 32b68cd14f1977c684caef97640276e93e958d5cc8e68b6ea0a88712c22cd553 Apr 16 18:15:16.906003 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:16.905963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" event={"ID":"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa","Type":"ContainerStarted","Data":"32b68cd14f1977c684caef97640276e93e958d5cc8e68b6ea0a88712c22cd553"} Apr 16 18:15:19.915856 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:19.915827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" event={"ID":"7c7a6ee9-b8d7-4906-a9be-22ca6504b9aa","Type":"ContainerStarted","Data":"b81dbbf2be9d0e8576555c3c7d6f17d75cfb165bd93a45a40ca6daadc944fd2b"} Apr 16 18:15:19.916214 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:19.915918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:15:19.930768 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:19.930723 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" podStartSLOduration=9.32816402 podStartE2EDuration="11.930710675s" podCreationTimestamp="2026-04-16 18:15:08 +0000 UTC" firstStartedPulling="2026-04-16 18:15:16.4475442 +0000 UTC m=+317.150875701" lastFinishedPulling="2026-04-16 18:15:19.050090854 +0000 UTC m=+319.753422356" observedRunningTime="2026-04-16 18:15:19.929316171 +0000 UTC m=+320.632647696" watchObservedRunningTime="2026-04-16 18:15:19.930710675 +0000 UTC m=+320.634042199" Apr 16 18:15:26.910152 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:26.910114 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-v7rzw" Apr 16 18:15:29.877829 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:29.877797 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m2mxx" Apr 16 18:15:31.887658 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:31.887630 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6hvmn" Apr 16 18:15:40.921459 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:15:40.921424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-vw2bw" Apr 16 18:16:13.693421 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.693326 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:13.699110 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.699089 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.701629 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.701435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 18:16:13.701629 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.701468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9mjxp\"" Apr 16 18:16:13.701629 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.701511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:16:13.701629 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.701559 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:16:13.702463 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.702442 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt"] Apr 16 18:16:13.704540 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.704518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.705774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.705752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:13.707141 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.706732 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:16:13.707141 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.706783 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-9df88\"" Apr 16 18:16:13.715095 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.715068 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt"] Apr 16 18:16:13.731036 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.731007 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-q4gnz"] Apr 16 18:16:13.733659 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.733642 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.735798 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.735777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vzcsd\"" Apr 16 18:16:13.735903 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.735777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:16:13.741971 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.741941 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-q4gnz"] Apr 16 18:16:13.794579 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.794727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/646b4080-16ab-45ab-be6e-1183ea42a5d3-data\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.794727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwdd\" (UniqueName: \"kubernetes.io/projected/24e1fec6-80d5-4325-a211-d58a433fde30-kube-api-access-jlwdd\") pod \"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.794727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2qcl\" (UniqueName: \"kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.794727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gj6s\" (UniqueName: \"kubernetes.io/projected/646b4080-16ab-45ab-be6e-1183ea42a5d3-kube-api-access-4gj6s\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.794927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.794803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24e1fec6-80d5-4325-a211-d58a433fde30-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.895517 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24e1fec6-80d5-4325-a211-d58a433fde30-cert\") pod 
\"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.895695 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.895695 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/646b4080-16ab-45ab-be6e-1183ea42a5d3-data\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.895695 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwdd\" (UniqueName: \"kubernetes.io/projected/24e1fec6-80d5-4325-a211-d58a433fde30-kube-api-access-jlwdd\") pod \"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.895859 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2qcl\" (UniqueName: \"kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.896001 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/646b4080-16ab-45ab-be6e-1183ea42a5d3-data\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.896082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.895862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gj6s\" (UniqueName: \"kubernetes.io/projected/646b4080-16ab-45ab-be6e-1183ea42a5d3-kube-api-access-4gj6s\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.898552 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.898527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24e1fec6-80d5-4325-a211-d58a433fde30-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:13.898656 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.898529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.904500 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.904475 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w2qcl\" (UniqueName: \"kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl\") pod \"kserve-controller-manager-65589c6846-r7tnc\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:13.904719 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.904697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gj6s\" (UniqueName: \"kubernetes.io/projected/646b4080-16ab-45ab-be6e-1183ea42a5d3-kube-api-access-4gj6s\") pod \"seaweedfs-86cc847c5c-q4gnz\" (UID: \"646b4080-16ab-45ab-be6e-1183ea42a5d3\") " pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:13.905009 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:13.904981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwdd\" (UniqueName: \"kubernetes.io/projected/24e1fec6-80d5-4325-a211-d58a433fde30-kube-api-access-jlwdd\") pod \"llmisvc-controller-manager-68cc5db7c4-dvkwt\" (UID: \"24e1fec6-80d5-4325-a211-d58a433fde30\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:14.010733 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.010644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:14.019471 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.019445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:14.043388 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.043358 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:14.160836 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.160795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:14.163805 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:16:14.163778 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9b4a97_7207_456f_97e8_fcd7cfce8793.slice/crio-ec7628b4d9485ed41c92ca72791c00e9f9c57d880aa1af2c061dadc8d64cc312 WatchSource:0}: Error finding container ec7628b4d9485ed41c92ca72791c00e9f9c57d880aa1af2c061dadc8d64cc312: Status 404 returned error can't find the container with id ec7628b4d9485ed41c92ca72791c00e9f9c57d880aa1af2c061dadc8d64cc312 Apr 16 18:16:14.171894 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.171870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt"] Apr 16 18:16:14.174528 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:16:14.174504 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod24e1fec6_80d5_4325_a211_d58a433fde30.slice/crio-b6a6afcc42c37f5c6d730ea0b86cd699acfc90a78c1ce906c878619f95fdbe0c WatchSource:0}: Error finding container b6a6afcc42c37f5c6d730ea0b86cd699acfc90a78c1ce906c878619f95fdbe0c: Status 404 returned error can't find the container with id b6a6afcc42c37f5c6d730ea0b86cd699acfc90a78c1ce906c878619f95fdbe0c Apr 16 18:16:14.196547 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:14.196526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-q4gnz"] Apr 16 18:16:14.198136 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:16:14.198112 2576 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod646b4080_16ab_45ab_be6e_1183ea42a5d3.slice/crio-43699ced8f57c9f5145dbf6f911cb13ea683cc050fd470f1ed7d1ed81b3a59d1 WatchSource:0}: Error finding container 43699ced8f57c9f5145dbf6f911cb13ea683cc050fd470f1ed7d1ed81b3a59d1: Status 404 returned error can't find the container with id 43699ced8f57c9f5145dbf6f911cb13ea683cc050fd470f1ed7d1ed81b3a59d1 Apr 16 18:16:15.089667 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:15.089611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" event={"ID":"5f9b4a97-7207-456f-97e8-fcd7cfce8793","Type":"ContainerStarted","Data":"ec7628b4d9485ed41c92ca72791c00e9f9c57d880aa1af2c061dadc8d64cc312"} Apr 16 18:16:15.091161 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:15.091127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-q4gnz" event={"ID":"646b4080-16ab-45ab-be6e-1183ea42a5d3","Type":"ContainerStarted","Data":"43699ced8f57c9f5145dbf6f911cb13ea683cc050fd470f1ed7d1ed81b3a59d1"} Apr 16 18:16:15.092745 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:15.092718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" event={"ID":"24e1fec6-80d5-4325-a211-d58a433fde30","Type":"ContainerStarted","Data":"b6a6afcc42c37f5c6d730ea0b86cd699acfc90a78c1ce906c878619f95fdbe0c"} Apr 16 18:16:19.108199 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.108166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" event={"ID":"5f9b4a97-7207-456f-97e8-fcd7cfce8793","Type":"ContainerStarted","Data":"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7"} Apr 16 18:16:19.108640 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.108273 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:19.109538 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.109516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-q4gnz" event={"ID":"646b4080-16ab-45ab-be6e-1183ea42a5d3","Type":"ContainerStarted","Data":"26057a2d10105e91cf1146ed71df5f70aeb6157ae16a86641c6fd2208558f58f"} Apr 16 18:16:19.109637 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.109617 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:19.110852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.110834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" event={"ID":"24e1fec6-80d5-4325-a211-d58a433fde30","Type":"ContainerStarted","Data":"1e3523b2ee7c4cc18090978c3586b170cb8ed7fdda11f600d0dc2dc416097edd"} Apr 16 18:16:19.110971 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.110957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:19.125927 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.125890 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" podStartSLOduration=1.506699295 podStartE2EDuration="6.125876668s" podCreationTimestamp="2026-04-16 18:16:13 +0000 UTC" firstStartedPulling="2026-04-16 18:16:14.165573418 +0000 UTC m=+374.868904925" lastFinishedPulling="2026-04-16 
18:16:18.784750787 +0000 UTC m=+379.488082298" observedRunningTime="2026-04-16 18:16:19.123205546 +0000 UTC m=+379.826537090" watchObservedRunningTime="2026-04-16 18:16:19.125876668 +0000 UTC m=+379.829208193" Apr 16 18:16:19.139014 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.138975 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" podStartSLOduration=1.529757925 podStartE2EDuration="6.138964073s" podCreationTimestamp="2026-04-16 18:16:13 +0000 UTC" firstStartedPulling="2026-04-16 18:16:14.176147338 +0000 UTC m=+374.879478859" lastFinishedPulling="2026-04-16 18:16:18.785353492 +0000 UTC m=+379.488685007" observedRunningTime="2026-04-16 18:16:19.137919237 +0000 UTC m=+379.841250760" watchObservedRunningTime="2026-04-16 18:16:19.138964073 +0000 UTC m=+379.842295628" Apr 16 18:16:19.153374 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:19.153337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-q4gnz" podStartSLOduration=1.5316386469999999 podStartE2EDuration="6.153325322s" podCreationTimestamp="2026-04-16 18:16:13 +0000 UTC" firstStartedPulling="2026-04-16 18:16:14.199453576 +0000 UTC m=+374.902785079" lastFinishedPulling="2026-04-16 18:16:18.821140251 +0000 UTC m=+379.524471754" observedRunningTime="2026-04-16 18:16:19.152012702 +0000 UTC m=+379.855344228" watchObservedRunningTime="2026-04-16 18:16:19.153325322 +0000 UTC m=+379.856656845" Apr 16 18:16:25.115648 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:25.115621 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-q4gnz" Apr 16 18:16:50.115309 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:50.115272 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dvkwt" Apr 16 18:16:50.118261 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:50.118229 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:51.517623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.517588 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:51.518018 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.517831 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" podUID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" containerName="manager" containerID="cri-o://aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7" gracePeriod=10 Apr 16 18:16:51.536192 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.536164 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-65589c6846-s4v7h"] Apr 16 18:16:51.604498 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.604461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-s4v7h"] Apr 16 18:16:51.604644 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.604590 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.714863 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.714811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8tjp\" (UniqueName: \"kubernetes.io/projected/92b32003-ca4f-49cb-8b14-026672e56812-kube-api-access-r8tjp\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.714987 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.714928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92b32003-ca4f-49cb-8b14-026672e56812-cert\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.789391 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.789371 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:51.815864 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.815836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8tjp\" (UniqueName: \"kubernetes.io/projected/92b32003-ca4f-49cb-8b14-026672e56812-kube-api-access-r8tjp\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.815864 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.815872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92b32003-ca4f-49cb-8b14-026672e56812-cert\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.818491 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.818469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92b32003-ca4f-49cb-8b14-026672e56812-cert\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.822971 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.822951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8tjp\" (UniqueName: \"kubernetes.io/projected/92b32003-ca4f-49cb-8b14-026672e56812-kube-api-access-r8tjp\") pod \"kserve-controller-manager-65589c6846-s4v7h\" (UID: \"92b32003-ca4f-49cb-8b14-026672e56812\") " pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:51.916461 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.916427 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2qcl\" (UniqueName: \"kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl\") pod \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " Apr 16 18:16:51.916611 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.916480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert\") pod 
\"5f9b4a97-7207-456f-97e8-fcd7cfce8793\" (UID: \"5f9b4a97-7207-456f-97e8-fcd7cfce8793\") " Apr 16 18:16:51.918613 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.918585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert" (OuterVolumeSpecName: "cert") pod "5f9b4a97-7207-456f-97e8-fcd7cfce8793" (UID: "5f9b4a97-7207-456f-97e8-fcd7cfce8793"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:16:51.918613 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.918594 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl" (OuterVolumeSpecName: "kube-api-access-w2qcl") pod "5f9b4a97-7207-456f-97e8-fcd7cfce8793" (UID: "5f9b4a97-7207-456f-97e8-fcd7cfce8793"). InnerVolumeSpecName "kube-api-access-w2qcl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:16:51.969456 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:51.969421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:52.017499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.017441 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2qcl\" (UniqueName: \"kubernetes.io/projected/5f9b4a97-7207-456f-97e8-fcd7cfce8793-kube-api-access-w2qcl\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:16:52.017499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.017474 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5f9b4a97-7207-456f-97e8-fcd7cfce8793-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:16:52.087623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.087593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-s4v7h"] Apr 16 18:16:52.090686 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:16:52.090658 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b32003_ca4f_49cb_8b14_026672e56812.slice/crio-4d392accb926b8b6accb6ab14ca4295c15e6e88e18a9f5b968717aef4d4bcb3c WatchSource:0}: Error finding container 4d392accb926b8b6accb6ab14ca4295c15e6e88e18a9f5b968717aef4d4bcb3c: Status 404 returned error can't find the container with id 4d392accb926b8b6accb6ab14ca4295c15e6e88e18a9f5b968717aef4d4bcb3c Apr 16 18:16:52.214944 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.214901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" event={"ID":"92b32003-ca4f-49cb-8b14-026672e56812","Type":"ContainerStarted","Data":"4d392accb926b8b6accb6ab14ca4295c15e6e88e18a9f5b968717aef4d4bcb3c"} Apr 16 18:16:52.215933 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.215913 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" containerID="aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7" exitCode=0 Apr 16 18:16:52.215988 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.215967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" event={"ID":"5f9b4a97-7207-456f-97e8-fcd7cfce8793","Type":"ContainerDied","Data":"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7"} Apr 16 18:16:52.216029 
ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.215988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" event={"ID":"5f9b4a97-7207-456f-97e8-fcd7cfce8793","Type":"ContainerDied","Data":"ec7628b4d9485ed41c92ca72791c00e9f9c57d880aa1af2c061dadc8d64cc312"} Apr 16 18:16:52.216029 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.215996 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-65589c6846-r7tnc" Apr 16 18:16:52.216120 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.216003 2576 scope.go:117] "RemoveContainer" containerID="aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7" Apr 16 18:16:52.224638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.224620 2576 scope.go:117] "RemoveContainer" containerID="aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7" Apr 16 18:16:52.224880 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:16:52.224864 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7\": container with ID starting with aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7 not found: ID does not exist" containerID="aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7" Apr 16 18:16:52.224924 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.224888 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7"} err="failed to get container status \"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7\": rpc error: code = NotFound desc = could not find container \"aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7\": container with ID starting with aecc76065c8a61f925ea73b2a561f8613bcb6012fb16fb63640dc21fb1a19ba7 not found: ID does not exist" Apr 16 18:16:52.236572 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.236548 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:52.240047 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:52.240025 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-65589c6846-r7tnc"] Apr 16 18:16:53.219865 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:53.219822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" event={"ID":"92b32003-ca4f-49cb-8b14-026672e56812","Type":"ContainerStarted","Data":"3c36acac8a40198a242d14698b231f5b0d0c990b786845b9ea5235ec86e06dc0"} Apr 16 18:16:53.220384 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:53.219934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:16:53.235294 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:53.235237 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" podStartSLOduration=1.754154795 podStartE2EDuration="2.235221692s" podCreationTimestamp="2026-04-16 18:16:51 +0000 UTC" firstStartedPulling="2026-04-16 18:16:52.09199746 +0000 UTC m=+412.795328962" lastFinishedPulling="2026-04-16 18:16:52.573064353 +0000 UTC m=+413.276395859" observedRunningTime="2026-04-16 18:16:53.233708954 +0000 UTC m=+413.937040480" 
watchObservedRunningTime="2026-04-16 18:16:53.235221692 +0000 UTC m=+413.938553216" Apr 16 18:16:53.899051 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:16:53.899018 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" path="/var/lib/kubelet/pods/5f9b4a97-7207-456f-97e8-fcd7cfce8793/volumes" Apr 16 18:17:24.229341 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:24.229263 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-65589c6846-s4v7h" Apr 16 18:17:25.607910 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.607874 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nrc5q"] Apr 16 18:17:25.608310 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.608231 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" containerName="manager" Apr 16 18:17:25.608310 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.608256 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" containerName="manager" Apr 16 18:17:25.608395 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.608332 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f9b4a97-7207-456f-97e8-fcd7cfce8793" containerName="manager" Apr 16 18:17:25.610458 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.610440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:25.612744 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.612722 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:17:25.612876 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.612725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-drz7b\"" Apr 16 18:17:25.620844 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.620818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nrc5q"] Apr 16 18:17:25.680578 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.680545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:25.680578 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.680581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz52q\" (UniqueName: \"kubernetes.io/projected/a238435e-74d0-4e28-be75-30402f705cc8-kube-api-access-mz52q\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:25.781052 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.781014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:25.781052 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.781055 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz52q\" (UniqueName: \"kubernetes.io/projected/a238435e-74d0-4e28-be75-30402f705cc8-kube-api-access-mz52q\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:25.781338 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:17:25.781152 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 18:17:25.781338 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:17:25.781214 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs podName:a238435e-74d0-4e28-be75-30402f705cc8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.281198401 +0000 UTC m=+446.984529902 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs") pod "model-serving-api-86f7b4b499-nrc5q" (UID: "a238435e-74d0-4e28-be75-30402f705cc8") : secret "model-serving-api-tls" not found Apr 16 18:17:25.789439 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:25.789398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz52q\" (UniqueName: \"kubernetes.io/projected/a238435e-74d0-4e28-be75-30402f705cc8-kube-api-access-mz52q\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:26.285256 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:26.285217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:26.287946 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:26.287924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a238435e-74d0-4e28-be75-30402f705cc8-tls-certs\") pod \"model-serving-api-86f7b4b499-nrc5q\" (UID: \"a238435e-74d0-4e28-be75-30402f705cc8\") " pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:26.521534 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:26.521472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:26.656925 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:26.656904 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nrc5q"] Apr 16 18:17:26.659258 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:17:26.659218 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda238435e_74d0_4e28_be75_30402f705cc8.slice/crio-de87e2e45b234614a403107be33f0924b0d800994097ba381d6777b3ea8cf0ad WatchSource:0}: Error finding container de87e2e45b234614a403107be33f0924b0d800994097ba381d6777b3ea8cf0ad: Status 404 returned error can't find the container with id de87e2e45b234614a403107be33f0924b0d800994097ba381d6777b3ea8cf0ad Apr 16 18:17:27.336905 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:27.336866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nrc5q" event={"ID":"a238435e-74d0-4e28-be75-30402f705cc8","Type":"ContainerStarted","Data":"de87e2e45b234614a403107be33f0924b0d800994097ba381d6777b3ea8cf0ad"} Apr 16 18:17:28.342018 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:28.341926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nrc5q" event={"ID":"a238435e-74d0-4e28-be75-30402f705cc8","Type":"ContainerStarted","Data":"9a1db7fe0d7f5f03b7a58fe3a5a8c1fce714a63f1a81130aae79cb72cdffc2ef"} Apr 16 18:17:28.342416 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:28.342020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:28.359015 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:28.358972 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nrc5q" podStartSLOduration=2.105195178 podStartE2EDuration="3.358959026s" podCreationTimestamp="2026-04-16 18:17:25 +0000 UTC" firstStartedPulling="2026-04-16 18:17:26.661451738 +0000 UTC m=+447.364783243" lastFinishedPulling="2026-04-16 18:17:27.915215585 +0000 UTC m=+448.618547091" observedRunningTime="2026-04-16 18:17:28.35801577 +0000 UTC m=+449.061347294" watchObservedRunningTime="2026-04-16 18:17:28.358959026 +0000 UTC m=+449.062290549" Apr 16 18:17:39.349718 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:39.349689 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nrc5q" Apr 16 18:17:41.491267 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:17:41.491209 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:18:02.310235 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.310201 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:18:02.313091 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.313072 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:18:02.315090 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.315068 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7d9sz\"" Apr 16 18:18:02.321003 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.320969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:18:02.324397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.324377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:18:02.455288 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.455258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:18:02.458338 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:18:02.458309 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69cd51b7_b70f_457d_bd3a_fca57bdb2b2a.slice/crio-d20e072fda1daf740ab842b792748babcd47592da03a3d1d929862b13092cb90 WatchSource:0}: Error finding container d20e072fda1daf740ab842b792748babcd47592da03a3d1d929862b13092cb90: Status 404 returned error can't find the container with id d20e072fda1daf740ab842b792748babcd47592da03a3d1d929862b13092cb90 Apr 16 18:18:02.507262 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.507218 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:18:02.510093 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.510070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:18:02.517160 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.517134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:18:02.521589 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.521568 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:18:02.648971 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.648948 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:18:02.651513 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:18:02.651481 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390db12b_b4b7_4d09_836f_6ed116b1f1ad.slice/crio-d06d68765a6924bdfbd4d546f931ee7730450ddf2b4b4f519fa617e83319a132 WatchSource:0}: Error finding container d06d68765a6924bdfbd4d546f931ee7730450ddf2b4b4f519fa617e83319a132: Status 404 returned error can't find the container with id d06d68765a6924bdfbd4d546f931ee7730450ddf2b4b4f519fa617e83319a132 Apr 16 18:18:02.734792 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.734761 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:18:02.737989 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.737973 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:02.745734 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.745699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:18:02.799667 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.799627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn\" (UID: \"41d6b77e-7da9-4457-916e-62779d3f7094\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:02.900696 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.900654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn\" (UID: \"41d6b77e-7da9-4457-916e-62779d3f7094\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:02.901072 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:02.901050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn\" (UID: \"41d6b77e-7da9-4457-916e-62779d3f7094\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:03.049642 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:03.049311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:03.224964 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:03.224920 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:18:03.227852 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:18:03.227818 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d6b77e_7da9_4457_916e_62779d3f7094.slice/crio-691ed4285d137c3b490f5a2409b19e88e18aab51c24e3ccca699924fbe0bb1dc WatchSource:0}: Error finding container 691ed4285d137c3b490f5a2409b19e88e18aab51c24e3ccca699924fbe0bb1dc: Status 404 returned error can't find the container with id 691ed4285d137c3b490f5a2409b19e88e18aab51c24e3ccca699924fbe0bb1dc Apr 16 18:18:03.470993 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:03.470841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" event={"ID":"390db12b-b4b7-4d09-836f-6ed116b1f1ad","Type":"ContainerStarted","Data":"d06d68765a6924bdfbd4d546f931ee7730450ddf2b4b4f519fa617e83319a132"} Apr 16 18:18:03.478237 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:03.478159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" event={"ID":"69cd51b7-b70f-457d-bd3a-fca57bdb2b2a","Type":"ContainerStarted","Data":"d20e072fda1daf740ab842b792748babcd47592da03a3d1d929862b13092cb90"} Apr 16 18:18:03.493811 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:03.493755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerStarted","Data":"691ed4285d137c3b490f5a2409b19e88e18aab51c24e3ccca699924fbe0bb1dc"} Apr 16 18:18:06.511510 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.511443 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d9f45f7f6-p679k" podUID="ce34c05d-d3c5-4938-86fb-d746afebb402" containerName="console" containerID="cri-o://208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f" gracePeriod=15 Apr 16 18:18:06.877621 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.877107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d9f45f7f6-p679k_ce34c05d-d3c5-4938-86fb-d746afebb402/console/0.log" Apr 16 18:18:06.877621 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.877195 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.954848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.954914 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.954983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.955012 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.955045 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.955085 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956058 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.955125 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxhjt\" (UniqueName: \"kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt\") pod \"ce34c05d-d3c5-4938-86fb-d746afebb402\" (UID: \"ce34c05d-d3c5-4938-86fb-d746afebb402\") " Apr 16 18:18:06.956556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.956349 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config" (OuterVolumeSpecName: "console-config") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:18:06.958109 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.957124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:18:06.958109 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.957284 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:18:06.958109 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.957416 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca" (OuterVolumeSpecName: "service-ca") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:18:06.975274 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.975211 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:18:06.975389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.975306 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt" (OuterVolumeSpecName: "kube-api-access-mxhjt") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "kube-api-access-mxhjt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:18:06.978122 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:06.978085 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ce34c05d-d3c5-4938-86fb-d746afebb402" (UID: "ce34c05d-d3c5-4938-86fb-d746afebb402"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.055967 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxhjt\" (UniqueName: \"kubernetes.io/projected/ce34c05d-d3c5-4938-86fb-d746afebb402-kube-api-access-mxhjt\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056005 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-console-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056020 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-oauth-config\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056035 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-service-ca\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056050 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce34c05d-d3c5-4938-86fb-d746afebb402-console-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056063 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-oauth-serving-cert\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.056164 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.056077 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce34c05d-d3c5-4938-86fb-d746afebb402-trusted-ca-bundle\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532317 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d9f45f7f6-p679k_ce34c05d-d3c5-4938-86fb-d746afebb402/console/0.log" Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532364 2576 generic.go:358] "Generic (PLEG): container finished" podID="ce34c05d-d3c5-4938-86fb-d746afebb402" containerID="208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f" exitCode=2 Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d9f45f7f6-p679k" event={"ID":"ce34c05d-d3c5-4938-86fb-d746afebb402","Type":"ContainerDied","Data":"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f"} Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d9f45f7f6-p679k" event={"ID":"ce34c05d-d3c5-4938-86fb-d746afebb402","Type":"ContainerDied","Data":"fdbeb964a7478c54dde84c635010e646ec8da4db230c01734e25fa720a53a6d6"} Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532569 2576 scope.go:117] 
"RemoveContainer" containerID="208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f" Apr 16 18:18:07.532919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.532791 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d9f45f7f6-p679k" Apr 16 18:18:07.568027 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.567981 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:18:07.570634 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.570599 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d9f45f7f6-p679k"] Apr 16 18:18:07.905422 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:07.904533 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce34c05d-d3c5-4938-86fb-d746afebb402" path="/var/lib/kubelet/pods/ce34c05d-d3c5-4938-86fb-d746afebb402/volumes" Apr 16 18:18:10.033754 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:10.033592 2576 scope.go:117] "RemoveContainer" containerID="208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f" Apr 16 18:18:10.034068 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:18:10.033947 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f\": container with ID starting with 208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f not found: ID does not exist" containerID="208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f" Apr 16 18:18:10.034068 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:10.033987 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f"} err="failed to get container status \"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f\": rpc error: code = NotFound desc = could not find container \"208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f\": container with ID starting with 208986e650bd12036db751558f512fdec7f2a5beef7ba4407d32c9f00b51b59f not found: ID does not exist" Apr 16 18:18:19.585010 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.584900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" event={"ID":"69cd51b7-b70f-457d-bd3a-fca57bdb2b2a","Type":"ContainerStarted","Data":"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1"} Apr 16 18:18:19.585495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.585090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:18:19.586525 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.586497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerStarted","Data":"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819"} Apr 16 18:18:19.586883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.586827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: 
connect: connection refused" Apr 16 18:18:19.587888 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.587864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" event={"ID":"390db12b-b4b7-4d09-836f-6ed116b1f1ad","Type":"ContainerStarted","Data":"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5"} Apr 16 18:18:19.588091 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.588074 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:18:19.589126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.589102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:18:19.599437 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.599391 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podStartSLOduration=0.731944441 podStartE2EDuration="17.599375536s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:02.460360691 +0000 UTC m=+483.163692208" lastFinishedPulling="2026-04-16 18:18:19.327791787 +0000 UTC m=+500.031123303" observedRunningTime="2026-04-16 18:18:19.598981929 +0000 UTC m=+500.302313465" watchObservedRunningTime="2026-04-16 18:18:19.599375536 +0000 UTC m=+500.302707065" Apr 16 18:18:19.611701 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:19.611653 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podStartSLOduration=0.93276666 podStartE2EDuration="17.611638213s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:02.65323014 +0000 UTC m=+483.356561642" lastFinishedPulling="2026-04-16 18:18:19.332101683 +0000 UTC m=+500.035433195" observedRunningTime="2026-04-16 18:18:19.610524352 +0000 UTC m=+500.313856079" watchObservedRunningTime="2026-04-16 18:18:19.611638213 +0000 UTC m=+500.314969740" Apr 16 18:18:20.591787 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:20.591743 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:18:20.592144 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:20.591753 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:18:23.602403 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:23.602308 2576 generic.go:358] "Generic (PLEG): container finished" podID="41d6b77e-7da9-4457-916e-62779d3f7094" containerID="5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819" exitCode=0 Apr 16 18:18:23.602403 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:23.602389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerDied","Data":"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819"} Apr 16 18:18:30.592399 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:30.592356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:18:30.592845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:30.592359 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:18:31.633846 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:31.633813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerStarted","Data":"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e"} Apr 16 18:18:31.634219 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:31.634098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:18:31.635436 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:31.635413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:18:31.649880 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:31.649829 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podStartSLOduration=2.147892349 podStartE2EDuration="29.649816219s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:03.230722728 +0000 UTC m=+483.934054233" lastFinishedPulling="2026-04-16 18:18:30.732646601 +0000 UTC m=+511.435978103" observedRunningTime="2026-04-16 18:18:31.647753173 +0000 UTC m=+512.351084697" watchObservedRunningTime="2026-04-16 18:18:31.649816219 +0000 UTC m=+512.353147742" Apr 16 18:18:32.637612 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:32.637576 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:18:40.592630 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:40.592581 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:18:40.593086 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:40.592581 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:18:42.637785 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:42.637742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:18:50.592641 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:50.592546 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:18:50.593020 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:50.592548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:18:52.637955 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:18:52.637917 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:19:00.592498 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:00.592458 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:19:00.592899 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:00.592462 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:19:02.637784 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:02.637734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:19:10.593415 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:10.593381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:19:10.593840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:10.593427 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:19:12.638201 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:12.638162 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" 
podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:19:22.638582 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:22.638541 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:19:32.463095 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.463059 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:19:32.463571 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.463394 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" containerID="cri-o://3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5" gracePeriod=30 Apr 16 18:19:32.514985 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.514954 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:19:32.515194 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.515175 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" containerID="cri-o://fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1" gracePeriod=30 Apr 16 18:19:32.540108 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.540081 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:19:32.540450 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.540437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce34c05d-d3c5-4938-86fb-d746afebb402" containerName="console" Apr 16 18:19:32.540497 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.540452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce34c05d-d3c5-4938-86fb-d746afebb402" containerName="console" Apr 16 18:19:32.540531 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.540515 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce34c05d-d3c5-4938-86fb-d746afebb402" containerName="console" Apr 16 18:19:32.543569 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.543551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:19:32.549163 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.549136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:19:32.554522 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.554503 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:19:32.630940 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.630906 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:19:32.635505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.635480 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:19:32.638023 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.637678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:19:32.644215 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.644188 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:19:32.649706 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.649661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:19:32.688303 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.688240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:19:32.781485 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.781464 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:19:32.783803 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:19:32.783777 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27188baf_2587_4799_8e1f_8d66cbbef954.slice/crio-3b3be2276fd886cf0719fb2dcc5dd95a0ae0364b77bdeee35e4a7cc974599454 WatchSource:0}: Error finding container 3b3be2276fd886cf0719fb2dcc5dd95a0ae0364b77bdeee35e4a7cc974599454: Status 404 returned error can't find the container with id 3b3be2276fd886cf0719fb2dcc5dd95a0ae0364b77bdeee35e4a7cc974599454 Apr 16 18:19:32.836852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.836784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" event={"ID":"27188baf-2587-4799-8e1f-8d66cbbef954","Type":"ContainerStarted","Data":"3b3be2276fd886cf0719fb2dcc5dd95a0ae0364b77bdeee35e4a7cc974599454"} Apr 16 18:19:32.838384 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:32.838346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" event={"ID":"e3228849-009a-4d99-a71b-830ff0834674","Type":"ContainerStarted","Data":"350ba2d1e9568a0b74edf5cd0a2a396ae1346777dc63c817cec60500cf9f6bda"} Apr 16 18:19:33.842457 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.842418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" event={"ID":"27188baf-2587-4799-8e1f-8d66cbbef954","Type":"ContainerStarted","Data":"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c"} Apr 16 18:19:33.842931 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.842614 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:19:33.843651 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.843620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:19:33.843848 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.843825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" event={"ID":"e3228849-009a-4d99-a71b-830ff0834674","Type":"ContainerStarted","Data":"11b4305d12f056ce570ffce4eecd2658f69a9404bff328a27839b47462a970f5"} Apr 16 18:19:33.843979 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.843949 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:19:33.844857 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.844834 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:19:33.856987 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.856941 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podStartSLOduration=1.856927626 podStartE2EDuration="1.856927626s" podCreationTimestamp="2026-04-16 18:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:33.856297772 +0000 UTC m=+574.559629297" watchObservedRunningTime="2026-04-16 18:19:33.856927626 +0000 UTC m=+574.560259150" Apr 16 18:19:33.870157 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:33.870114 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podStartSLOduration=1.8701022919999999 podStartE2EDuration="1.870102292s" podCreationTimestamp="2026-04-16 18:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:33.868730912 +0000 UTC m=+574.572062427" watchObservedRunningTime="2026-04-16 18:19:33.870102292 +0000 UTC m=+574.573433816" Apr 16 18:19:34.848318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:34.848279 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:19:34.848318 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:34.848294 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:19:35.783977 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.783921 2576 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:19:35.814404 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.814379 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:19:35.853053 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.853008 2576 generic.go:358] "Generic (PLEG): container finished" podID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerID="fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1" exitCode=0 Apr 16 18:19:35.853499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.853074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" event={"ID":"69cd51b7-b70f-457d-bd3a-fca57bdb2b2a","Type":"ContainerDied","Data":"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1"} Apr 16 18:19:35.853499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.853094 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" Apr 16 18:19:35.853499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.853128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v" event={"ID":"69cd51b7-b70f-457d-bd3a-fca57bdb2b2a","Type":"ContainerDied","Data":"d20e072fda1daf740ab842b792748babcd47592da03a3d1d929862b13092cb90"} Apr 16 18:19:35.853499 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.853149 2576 scope.go:117] "RemoveContainer" containerID="fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1" Apr 16 18:19:35.854509 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.854487 2576 generic.go:358] "Generic (PLEG): container finished" podID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerID="3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5" exitCode=0 Apr 16 18:19:35.854603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.854552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" event={"ID":"390db12b-b4b7-4d09-836f-6ed116b1f1ad","Type":"ContainerDied","Data":"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5"} Apr 16 18:19:35.854603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.854559 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" Apr 16 18:19:35.854603 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.854578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch" event={"ID":"390db12b-b4b7-4d09-836f-6ed116b1f1ad","Type":"ContainerDied","Data":"d06d68765a6924bdfbd4d546f931ee7730450ddf2b4b4f519fa617e83319a132"} Apr 16 18:19:35.863143 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.863119 2576 scope.go:117] "RemoveContainer" containerID="fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1" Apr 16 18:19:35.863837 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:19:35.863810 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1\": container with ID starting with fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1 not found: ID does not exist" containerID="fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1" Apr 16 18:19:35.863934 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.863855 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1"} err="failed to get container status \"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1\": rpc error: code = NotFound desc = could not find container \"fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1\": container with ID starting with fee2c0a163819303d243896e33e10cb0f30ffc4f608c64cafc1b629f799a81d1 not found: ID does not exist" Apr 16 18:19:35.863934 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.863883 2576 scope.go:117] "RemoveContainer" containerID="3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5" Apr 16 18:19:35.874370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.874348 2576 scope.go:117] "RemoveContainer" containerID="3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5" Apr 16 18:19:35.874836 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:19:35.874816 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5\": container with ID starting with 3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5 not found: ID does not exist" containerID="3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5" Apr 16 18:19:35.874882 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.874846 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5"} err="failed to get container status \"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5\": rpc error: code = NotFound desc = could not find container \"3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5\": container with ID starting with 3c34c703ece4a6143a88a678e5ff364bcbbb0503a4995526665a7c2d7e80f8e5 not found: ID does not exist" Apr 16 18:19:35.877983 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.877961 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:19:35.881299 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.881282 2576 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-47091-predictor-5d9486d6d7-pwtch"] Apr 16 18:19:35.889828 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.889802 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:19:35.893877 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.893857 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-47091-predictor-7c875766fd-97c5v"] Apr 16 18:19:35.898672 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.898647 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" path="/var/lib/kubelet/pods/390db12b-b4b7-4d09-836f-6ed116b1f1ad/volumes" Apr 16 18:19:35.898876 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:35.898863 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" path="/var/lib/kubelet/pods/69cd51b7-b70f-457d-bd3a-fca57bdb2b2a/volumes" Apr 16 18:19:41.899059 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:41.899033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:19:44.848753 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:44.848711 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:19:44.849197 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:44.848718 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:19:54.848357 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:54.848302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:19:54.848753 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:19:54.848515 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:20:04.848589 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:04.848543 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:20:04.851141 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:04.848537 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection 
refused" Apr 16 18:20:14.848887 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:14.848800 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:20:14.849283 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:14.848805 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:20:22.403035 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.402999 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:20:22.403495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.403373 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" containerID="cri-o://d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e" gracePeriod=30 Apr 16 18:20:22.438195 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438159 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:20:22.438690 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438672 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" Apr 16 18:20:22.438752 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438694 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" Apr 16 18:20:22.438752 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438737 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" Apr 16 18:20:22.438752 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438746 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" Apr 16 18:20:22.438845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438839 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="390db12b-b4b7-4d09-836f-6ed116b1f1ad" containerName="kserve-container" Apr 16 18:20:22.438878 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.438856 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="69cd51b7-b70f-457d-bd3a-fca57bdb2b2a" containerName="kserve-container" Apr 16 18:20:22.442295 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.442272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:20:22.451623 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.451596 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:20:22.453966 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.453946 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:20:22.529112 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.529080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:20:22.532656 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.532640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 18:20:22.540019 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.539684 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:20:22.547692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.547673 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 18:20:22.589093 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.589021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:20:22.594847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.594765 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:20:22.679028 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:22.679001 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:20:22.681179 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:20:22.681153 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd952a4_c57d_4b6d_be26_6984769def9a.slice/crio-79a1978b5cb87382c34561a4a69e2a24f483b0805144aff517ad974ba1be6716 WatchSource:0}: Error finding container 79a1978b5cb87382c34561a4a69e2a24f483b0805144aff517ad974ba1be6716: Status 404 returned error can't find the container with id 79a1978b5cb87382c34561a4a69e2a24f483b0805144aff517ad974ba1be6716 Apr 16 18:20:23.011482 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.011390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" event={"ID":"fcd952a4-c57d-4b6d-be26-6984769def9a","Type":"ContainerStarted","Data":"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2"} Apr 16 18:20:23.011482 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.011427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" event={"ID":"fcd952a4-c57d-4b6d-be26-6984769def9a","Type":"ContainerStarted","Data":"79a1978b5cb87382c34561a4a69e2a24f483b0805144aff517ad974ba1be6716"} Apr 16 18:20:23.011709 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.011677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 18:20:23.012675 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.012651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" event={"ID":"8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27","Type":"ContainerStarted","Data":"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092"} Apr 16 18:20:23.012805 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.012680 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" event={"ID":"8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27","Type":"ContainerStarted","Data":"e6b15d33f2e9b3c29b2391dd1621d7f0c67120c6bcd40d029e305bb19acd0089"} Apr 16 18:20:23.012805 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.012796 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:20:23.012921 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.012846 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:20:23.013743 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.013723 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:20:23.027035 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.026979 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podStartSLOduration=1.026960637 podStartE2EDuration="1.026960637s" podCreationTimestamp="2026-04-16 18:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:23.024429692 +0000 UTC m=+623.727761217" watchObservedRunningTime="2026-04-16 18:20:23.026960637 +0000 UTC m=+623.730292162" Apr 16 18:20:23.036627 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:23.036576 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podStartSLOduration=1.036563586 podStartE2EDuration="1.036563586s" podCreationTimestamp="2026-04-16 18:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:23.03651699 +0000 UTC m=+623.739848511" watchObservedRunningTime="2026-04-16 18:20:23.036563586 +0000 UTC m=+623.739895111" Apr 16 18:20:24.016575 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:24.016535 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:20:24.016575 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:24.016555 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:20:24.849768 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:24.849735 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:20:24.850225 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:20:24.850203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:20:26.839397 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:26.839370 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:20:26.907229 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:26.907141 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location\") pod \"41d6b77e-7da9-4457-916e-62779d3f7094\" (UID: \"41d6b77e-7da9-4457-916e-62779d3f7094\") " Apr 16 18:20:26.907519 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:26.907495 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41d6b77e-7da9-4457-916e-62779d3f7094" (UID: "41d6b77e-7da9-4457-916e-62779d3f7094"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:27.008466 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.008436 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d6b77e-7da9-4457-916e-62779d3f7094-kserve-provision-location\") on node \"ip-10-0-143-51.ec2.internal\" DevicePath \"\"" Apr 16 18:20:27.029655 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.029626 2576 generic.go:358] "Generic (PLEG): container finished" podID="41d6b77e-7da9-4457-916e-62779d3f7094" containerID="d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e" exitCode=0 Apr 16 18:20:27.029827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.029670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerDied","Data":"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e"} Apr 16 18:20:27.029827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.029698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" event={"ID":"41d6b77e-7da9-4457-916e-62779d3f7094","Type":"ContainerDied","Data":"691ed4285d137c3b490f5a2409b19e88e18aab51c24e3ccca699924fbe0bb1dc"} Apr 16 18:20:27.029827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.029717 2576 scope.go:117] "RemoveContainer" containerID="d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e" Apr 16 18:20:27.029827 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.029715 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn" Apr 16 18:20:27.037896 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.037879 2576 scope.go:117] "RemoveContainer" containerID="5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819" Apr 16 18:20:27.046513 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.046492 2576 scope.go:117] "RemoveContainer" containerID="d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e" Apr 16 18:20:27.046812 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:20:27.046789 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e\": container with ID starting with d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e not found: ID does not exist" containerID="d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e" Apr 16 18:20:27.046882 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.046822 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e"} err="failed to get container status \"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e\": rpc error: code = NotFound desc = could not find container \"d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e\": container with ID starting with d2286f78adeaf3e6ddc64006c226300fcbf3549aa93b94329db73bd24277fa9e not found: ID does not exist" Apr 16 18:20:27.046882 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.046840 2576 scope.go:117] "RemoveContainer" containerID="5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819" Apr 16 18:20:27.047116 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:20:27.047098 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819\": container with ID starting with 5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819 not found: ID does not exist" containerID="5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819" Apr 16 18:20:27.047170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.047125 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819"} err="failed to get container status \"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819\": rpc error: code = NotFound desc = could not find container \"5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819\": container with ID starting with 5f0c4aeb7ce941722d208d4b67e73fcbfd75ffde9d8126289351ed17f4154819 not found: ID does not exist" Apr 16 18:20:27.053124 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.053099 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:20:27.056215 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.056194 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7c54695cb6-zhrvn"] Apr 16 18:20:27.899414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:27.899384 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" 
path="/var/lib/kubelet/pods/41d6b77e-7da9-4457-916e-62779d3f7094/volumes" Apr 16 18:20:34.017375 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:34.017328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:20:34.017750 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:34.017328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:20:44.017512 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:44.017474 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:20:44.017858 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:44.017476 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:20:54.017156 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:54.017113 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:20:54.017626 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:20:54.017114 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:21:04.017482 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:21:04.017436 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:21:04.017943 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:21:04.017436 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:21:14.018410 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:21:14.018381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:21:14.018801 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:21:14.018430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 
18:28:57.514532 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.514498 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:28:57.516389 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.514799 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" containerID="cri-o://11b4305d12f056ce570ffce4eecd2658f69a9404bff328a27839b47462a970f5" gracePeriod=30 Apr 16 18:28:57.583458 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583420 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:28:57.583852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583837 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" Apr 16 18:28:57.583907 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583854 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" Apr 16 18:28:57.583907 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583867 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="storage-initializer" Apr 16 18:28:57.583907 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="storage-initializer" Apr 16 18:28:57.584013 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.583939 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="41d6b77e-7da9-4457-916e-62779d3f7094" containerName="kserve-container" Apr 16 18:28:57.586291 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.586272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:28:57.589839 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.589809 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:28:57.590136 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.590094 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" containerID="cri-o://7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c" gracePeriod=30 Apr 16 18:28:57.597938 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.597919 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:28:57.599431 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.599406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:28:57.639030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.638992 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:28:57.642393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.642371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:28:57.648774 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.648750 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:28:57.655951 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.655873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:28:57.744536 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.744511 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:28:57.747435 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:28:57.747405 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8020656a_80c3_4415_9248_4c823e247e60.slice/crio-1f10247271730f7fc5fda910b2580f33a054c676b81cf03e02f3ed568bd6d8c6 WatchSource:0}: Error finding container 1f10247271730f7fc5fda910b2580f33a054c676b81cf03e02f3ed568bd6d8c6: Status 404 returned error can't find the container with id 1f10247271730f7fc5fda910b2580f33a054c676b81cf03e02f3ed568bd6d8c6 Apr 16 18:28:57.749841 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.749813 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:28:57.793040 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:57.793015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:28:57.795651 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:28:57.795626 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod620cdff5_ebb8_4fce_9e06_726bda85841e.slice/crio-31ce64bc26469e9254200831a45fdcad88b87e4020665717bf710cdb8648fb85 WatchSource:0}: Error finding container 31ce64bc26469e9254200831a45fdcad88b87e4020665717bf710cdb8648fb85: Status 404 returned error can't find the container with id 31ce64bc26469e9254200831a45fdcad88b87e4020665717bf710cdb8648fb85 Apr 16 18:28:58.680297 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.680264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" event={"ID":"620cdff5-ebb8-4fce-9e06-726bda85841e","Type":"ContainerStarted","Data":"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795"} Apr 16 18:28:58.680734 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.680305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" 
event={"ID":"620cdff5-ebb8-4fce-9e06-726bda85841e","Type":"ContainerStarted","Data":"31ce64bc26469e9254200831a45fdcad88b87e4020665717bf710cdb8648fb85"} Apr 16 18:28:58.680734 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.680520 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:28:58.681709 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.681683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" event={"ID":"8020656a-80c3-4415-9248-4c823e247e60","Type":"ContainerStarted","Data":"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97"} Apr 16 18:28:58.681787 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.681714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" event={"ID":"8020656a-80c3-4415-9248-4c823e247e60","Type":"ContainerStarted","Data":"1f10247271730f7fc5fda910b2580f33a054c676b81cf03e02f3ed568bd6d8c6"} Apr 16 18:28:58.681837 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.681790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:28:58.681913 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.681898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:28:58.682954 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.682927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:28:58.695435 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.695399 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podStartSLOduration=1.695386702 podStartE2EDuration="1.695386702s" podCreationTimestamp="2026-04-16 18:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:58.694669545 +0000 UTC m=+1139.398001091" watchObservedRunningTime="2026-04-16 18:28:58.695386702 +0000 UTC m=+1139.398718260" Apr 16 18:28:58.708082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:58.708041 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podStartSLOduration=1.708028402 podStartE2EDuration="1.708028402s" podCreationTimestamp="2026-04-16 18:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:58.7065406 +0000 UTC m=+1139.409872124" watchObservedRunningTime="2026-04-16 18:28:58.708028402 +0000 UTC m=+1139.411359926" Apr 16 18:28:59.685649 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:59.685605 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" 
podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:28:59.686013 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:28:59.685606 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:29:00.690083 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:00.690049 2576 generic.go:358] "Generic (PLEG): container finished" podID="e3228849-009a-4d99-a71b-830ff0834674" containerID="11b4305d12f056ce570ffce4eecd2658f69a9404bff328a27839b47462a970f5" exitCode=0 Apr 16 18:29:00.690441 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:00.690119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" event={"ID":"e3228849-009a-4d99-a71b-830ff0834674","Type":"ContainerDied","Data":"11b4305d12f056ce570ffce4eecd2658f69a9404bff328a27839b47462a970f5"} Apr 16 18:29:00.767357 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:00.767329 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:29:00.833387 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:00.833363 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:29:01.694932 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.694897 2576 generic.go:358] "Generic (PLEG): container finished" podID="27188baf-2587-4799-8e1f-8d66cbbef954" containerID="7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c" exitCode=0 Apr 16 18:29:01.695368 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.694966 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" Apr 16 18:29:01.695368 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.694980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" event={"ID":"27188baf-2587-4799-8e1f-8d66cbbef954","Type":"ContainerDied","Data":"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c"} Apr 16 18:29:01.695368 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.695028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf" event={"ID":"27188baf-2587-4799-8e1f-8d66cbbef954","Type":"ContainerDied","Data":"3b3be2276fd886cf0719fb2dcc5dd95a0ae0364b77bdeee35e4a7cc974599454"} Apr 16 18:29:01.695368 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.695051 2576 scope.go:117] "RemoveContainer" containerID="7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c" Apr 16 18:29:01.696290 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.696228 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" Apr 16 18:29:01.696290 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.696230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw" event={"ID":"e3228849-009a-4d99-a71b-830ff0834674","Type":"ContainerDied","Data":"350ba2d1e9568a0b74edf5cd0a2a396ae1346777dc63c817cec60500cf9f6bda"} Apr 16 18:29:01.703207 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.703191 2576 scope.go:117] "RemoveContainer" containerID="7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c" Apr 16 18:29:01.703486 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:29:01.703465 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c\": container with ID starting with 7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c not found: ID does not exist" containerID="7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c" Apr 16 18:29:01.703554 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.703494 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c"} err="failed to get container status \"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c\": rpc error: code = NotFound desc = could not find container \"7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c\": container with ID starting with 7afa4c5a26ade5678b0dc671f703a92cf6d26c12e9e8e86de9f3604dbaa8d30c not found: ID does not exist" Apr 16 18:29:01.703554 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.703510 2576 scope.go:117] "RemoveContainer" containerID="11b4305d12f056ce570ffce4eecd2658f69a9404bff328a27839b47462a970f5" Apr 16 18:29:01.721777 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.721753 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:29:01.725231 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.725212 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-14a0c-predictor-78c89c94f9-wnmmf"] Apr 16 18:29:01.734273 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.734224 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:29:01.740000 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.739977 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-14a0c-predictor-6fd9f9f998-6gdkw"] Apr 16 18:29:01.898972 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.898941 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" path="/var/lib/kubelet/pods/27188baf-2587-4799-8e1f-8d66cbbef954/volumes" Apr 16 18:29:01.899183 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:01.899170 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3228849-009a-4d99-a71b-830ff0834674" path="/var/lib/kubelet/pods/e3228849-009a-4d99-a71b-830ff0834674/volumes" Apr 16 18:29:09.685999 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:09.685951 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" 
podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:29:09.686415 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:09.685951 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:29:19.685715 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:19.685614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:29:19.686112 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:19.685827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:29:29.686347 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:29.686299 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:29:29.686821 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:29.686299 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:29:39.686310 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:39.686231 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:29:39.686719 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:39.686231 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:29:47.447241 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.447196 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:29:47.447722 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.447518 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" containerID="cri-o://c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092" gracePeriod=30 Apr 16 18:29:47.484370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484341 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:29:47.484690 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484679 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" Apr 16 18:29:47.484738 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484693 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" Apr 16 18:29:47.484738 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484701 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" Apr 16 18:29:47.484738 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484707 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" Apr 16 18:29:47.484828 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484757 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3228849-009a-4d99-a71b-830ff0834674" containerName="kserve-container" Apr 16 18:29:47.484828 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.484767 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="27188baf-2587-4799-8e1f-8d66cbbef954" containerName="kserve-container" Apr 16 18:29:47.487615 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.487596 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:29:47.496485 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.496458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:29:47.498143 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.498125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:29:47.555146 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.555114 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:29:47.555509 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.555450 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" containerID="cri-o://f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2" gracePeriod=30 Apr 16 18:29:47.590420 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.589553 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:29:47.594503 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.594473 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:29:47.607441 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.607419 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:29:47.608196 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.608042 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:29:47.645443 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.645413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:29:47.647307 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:29:47.647273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043d90df_836c_46a3_9145_e88f9c1e786f.slice/crio-622f6eab4325119b293da21b51cec80c64200323ea54bb44ae316bb2de29b4c0 WatchSource:0}: Error finding container 622f6eab4325119b293da21b51cec80c64200323ea54bb44ae316bb2de29b4c0: Status 404 returned error can't find the container with id 622f6eab4325119b293da21b51cec80c64200323ea54bb44ae316bb2de29b4c0 Apr 16 18:29:47.737421 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.737396 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:29:47.740434 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:29:47.740411 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3470a1d_471d_46f5_bfa1_174b9c19e40d.slice/crio-58f2a07785622d1be3ba9bc39df2e1ce6f9dd3cfc5d5bcad149ab0ba9870182b WatchSource:0}: Error finding container 58f2a07785622d1be3ba9bc39df2e1ce6f9dd3cfc5d5bcad149ab0ba9870182b: Status 404 returned error can't find the container with id 58f2a07785622d1be3ba9bc39df2e1ce6f9dd3cfc5d5bcad149ab0ba9870182b Apr 16 18:29:47.853090 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.853058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" event={"ID":"043d90df-836c-46a3-9145-e88f9c1e786f","Type":"ContainerStarted","Data":"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124"} Apr 16 18:29:47.853277 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.853111 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:29:47.853277 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.853127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" event={"ID":"043d90df-836c-46a3-9145-e88f9c1e786f","Type":"ContainerStarted","Data":"622f6eab4325119b293da21b51cec80c64200323ea54bb44ae316bb2de29b4c0"} Apr 16 18:29:47.854414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.854387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" event={"ID":"f3470a1d-471d-46f5-bfa1-174b9c19e40d","Type":"ContainerStarted","Data":"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845"} Apr 16 18:29:47.854526 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.854421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" event={"ID":"f3470a1d-471d-46f5-bfa1-174b9c19e40d","Type":"ContainerStarted","Data":"58f2a07785622d1be3ba9bc39df2e1ce6f9dd3cfc5d5bcad149ab0ba9870182b"} Apr 16 18:29:47.854584 
ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.854570 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:29:47.854868 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.854836 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:29:47.855517 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.855494 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:29:47.876691 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.876644 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podStartSLOduration=0.876629018 podStartE2EDuration="876.629018ms" podCreationTimestamp="2026-04-16 18:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:47.874736812 +0000 UTC m=+1188.578068336" watchObservedRunningTime="2026-04-16 18:29:47.876629018 +0000 UTC m=+1188.579960538" Apr 16 18:29:47.896007 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:47.895950 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podStartSLOduration=0.895930983 podStartE2EDuration="895.930983ms" podCreationTimestamp="2026-04-16 18:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:47.895303533 +0000 UTC m=+1188.598635059" watchObservedRunningTime="2026-04-16 18:29:47.895930983 +0000 UTC m=+1188.599262508" Apr 16 18:29:48.857930 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:48.857883 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:29:48.858414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:48.857999 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:29:49.686476 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:49.686442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:29:49.687004 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:49.686986 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:29:50.700628 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.700603 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:29:50.804919 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.804896 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 18:29:50.870100 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.870071 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerID="c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092" exitCode=0 Apr 16 18:29:50.870305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.870129 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" Apr 16 18:29:50.870305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.870158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" event={"ID":"8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27","Type":"ContainerDied","Data":"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092"} Apr 16 18:29:50.870305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.870211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx" event={"ID":"8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27","Type":"ContainerDied","Data":"e6b15d33f2e9b3c29b2391dd1621d7f0c67120c6bcd40d029e305bb19acd0089"} Apr 16 18:29:50.870305 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.870228 2576 scope.go:117] "RemoveContainer" containerID="c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092" Apr 16 18:29:50.871407 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.871380 2576 generic.go:358] "Generic (PLEG): container finished" podID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerID="f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2" exitCode=0 Apr 16 18:29:50.871483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.871412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" event={"ID":"fcd952a4-c57d-4b6d-be26-6984769def9a","Type":"ContainerDied","Data":"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2"} Apr 16 18:29:50.871483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.871440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" event={"ID":"fcd952a4-c57d-4b6d-be26-6984769def9a","Type":"ContainerDied","Data":"79a1978b5cb87382c34561a4a69e2a24f483b0805144aff517ad974ba1be6716"} Apr 16 18:29:50.871483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.871438 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw" Apr 16 18:29:50.879959 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.879938 2576 scope.go:117] "RemoveContainer" containerID="c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092" Apr 16 18:29:50.880187 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:29:50.880168 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092\": container with ID starting with c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092 not found: ID does not exist" containerID="c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092" Apr 16 18:29:50.880283 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.880194 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092"} err="failed to get container status \"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092\": rpc error: code = NotFound desc = could not find container \"c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092\": container with ID starting with c1140b7b91baab608958670ef7c40003bf95082d681f54fdf3fd686da7e45092 not found: ID does not exist" Apr 16 18:29:50.880283 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.880213 2576 scope.go:117] "RemoveContainer" containerID="f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2" Apr 16 18:29:50.887368 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.887285 2576 scope.go:117] "RemoveContainer" containerID="f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2" Apr 16 18:29:50.887569 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:29:50.887551 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2\": container with ID starting with f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2 not found: ID does not exist" containerID="f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2" Apr 16 18:29:50.887617 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.887574 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2"} err="failed to get container status \"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2\": rpc error: code = NotFound desc = could not find container \"f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2\": container with ID starting with f974d5b7efcd0d935ef3dc36b7938e8e78df29df1345b0fb24b0366a751a2bf2 not found: ID does not exist" Apr 16 18:29:50.891430 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.891411 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:29:50.898023 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.898003 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-68d6a-predictor-56b57ff594-f4qdx"] Apr 16 18:29:50.913018 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:50.913000 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:29:50.918104 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:29:50.918086 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-68d6a-predictor-84c8c878d7-tljnw"] Apr 16 18:29:51.898841 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:51.898807 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" path="/var/lib/kubelet/pods/8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27/volumes" Apr 16 18:29:51.899199 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:51.899043 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" path="/var/lib/kubelet/pods/fcd952a4-c57d-4b6d-be26-6984769def9a/volumes" Apr 16 18:29:58.858122 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:58.858084 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:29:58.858522 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:29:58.858084 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:30:08.857973 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:08.857926 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:30:08.858410 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:08.857929 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:30:18.080230 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.080192 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:30:18.080606 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.080447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" containerID="cri-o://5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795" gracePeriod=30 Apr 16 18:30:18.081692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.081668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:30:18.082031 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.082017 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" Apr 16 18:30:18.082094 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.082032 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" Apr 16 18:30:18.082094 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:30:18.082044 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" Apr 16 18:30:18.082094 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.082050 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" Apr 16 18:30:18.082220 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.082119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e2fad16-84cf-4db7-ba08-e7ca4c7e3b27" containerName="kserve-container" Apr 16 18:30:18.082220 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.082128 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcd952a4-c57d-4b6d-be26-6984769def9a" containerName="kserve-container" Apr 16 18:30:18.086720 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.086703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:30:18.095888 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.095872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:30:18.106829 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.106799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:30:18.182329 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.182298 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:30:18.182638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.182588 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" containerID="cri-o://2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97" gracePeriod=30 Apr 16 18:30:18.246212 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.246180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:30:18.249528 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:30:18.249497 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod176a5137_9f81_42d9_9bae_968764c70863.slice/crio-03d19aa6b0227aa174f1b1f25557454081a5a51c789b11e6ad9faeb3c6032699 WatchSource:0}: Error finding container 03d19aa6b0227aa174f1b1f25557454081a5a51c789b11e6ad9faeb3c6032699: Status 404 returned error can't find the container with id 03d19aa6b0227aa174f1b1f25557454081a5a51c789b11e6ad9faeb3c6032699 Apr 16 18:30:18.285940 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.285910 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:30:18.291156 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.291132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:30:18.302454 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.302439 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:30:18.345016 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.344975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:30:18.450860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.450831 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:30:18.454323 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:30:18.454295 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9ab55c_1bd0_4f32_8561_b12fc9b20f77.slice/crio-550169b88fadb75969200aaa48e02ca51b8d7870af8f7aead87cf69b57836f52 WatchSource:0}: Error finding container 550169b88fadb75969200aaa48e02ca51b8d7870af8f7aead87cf69b57836f52: Status 404 returned error can't find the container with id 550169b88fadb75969200aaa48e02ca51b8d7870af8f7aead87cf69b57836f52 Apr 16 18:30:18.857974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.857930 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:30:18.858152 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.857933 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:30:18.961316 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.961280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" event={"ID":"176a5137-9f81-42d9-9bae-968764c70863","Type":"ContainerStarted","Data":"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346"} Apr 16 18:30:18.961316 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.961320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" event={"ID":"176a5137-9f81-42d9-9bae-968764c70863","Type":"ContainerStarted","Data":"03d19aa6b0227aa174f1b1f25557454081a5a51c789b11e6ad9faeb3c6032699"} Apr 16 18:30:18.961556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.961442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:30:18.962764 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.962735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" event={"ID":"7b9ab55c-1bd0-4f32-8561-b12fc9b20f77","Type":"ContainerStarted","Data":"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055"} Apr 16 18:30:18.962764 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.962765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" event={"ID":"7b9ab55c-1bd0-4f32-8561-b12fc9b20f77","Type":"ContainerStarted","Data":"550169b88fadb75969200aaa48e02ca51b8d7870af8f7aead87cf69b57836f52"} Apr 16 18:30:18.962907 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:30:18.962854 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:18.962907 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.962902 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:30:18.963840 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.963817 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:30:18.990033 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:18.989987 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podStartSLOduration=0.989974922 podStartE2EDuration="989.974922ms" podCreationTimestamp="2026-04-16 18:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:18.987321842 +0000 UTC m=+1219.690653365" watchObservedRunningTime="2026-04-16 18:30:18.989974922 +0000 UTC m=+1219.693306446" Apr 16 18:30:19.686172 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:19.686127 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 18:30:19.686565 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:19.686129 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 18:30:19.965956 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:19.965853 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:30:19.965956 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:19.965853 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:22.135500 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.135470 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:30:22.159740 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.159690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podStartSLOduration=4.159673265 podStartE2EDuration="4.159673265s" podCreationTimestamp="2026-04-16 18:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:19.021682015 +0000 UTC m=+1219.725013538" watchObservedRunningTime="2026-04-16 18:30:22.159673265 +0000 UTC m=+1222.863004789" Apr 16 18:30:22.213811 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.213786 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:30:22.977372 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.977340 2576 generic.go:358] "Generic (PLEG): container finished" podID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerID="5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795" exitCode=0 Apr 16 18:30:22.977549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.977405 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" Apr 16 18:30:22.977549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.977424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" event={"ID":"620cdff5-ebb8-4fce-9e06-726bda85841e","Type":"ContainerDied","Data":"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795"} Apr 16 18:30:22.977549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.977464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627" event={"ID":"620cdff5-ebb8-4fce-9e06-726bda85841e","Type":"ContainerDied","Data":"31ce64bc26469e9254200831a45fdcad88b87e4020665717bf710cdb8648fb85"} Apr 16 18:30:22.977549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.977486 2576 scope.go:117] "RemoveContainer" containerID="5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795" Apr 16 18:30:22.978685 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.978660 2576 generic.go:358] "Generic (PLEG): container finished" podID="8020656a-80c3-4415-9248-4c823e247e60" containerID="2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97" exitCode=0 Apr 16 18:30:22.978809 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.978720 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" Apr 16 18:30:22.978809 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.978758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" event={"ID":"8020656a-80c3-4415-9248-4c823e247e60","Type":"ContainerDied","Data":"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97"} Apr 16 18:30:22.978809 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.978786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg" event={"ID":"8020656a-80c3-4415-9248-4c823e247e60","Type":"ContainerDied","Data":"1f10247271730f7fc5fda910b2580f33a054c676b81cf03e02f3ed568bd6d8c6"} Apr 16 18:30:22.985967 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.985954 2576 scope.go:117] "RemoveContainer" containerID="5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795" Apr 16 18:30:22.986299 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:30:22.986277 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795\": container with ID starting with 5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795 not found: ID does not exist" containerID="5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795" Apr 16 18:30:22.986377 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.986309 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795"} err="failed to get container status \"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795\": rpc error: code = NotFound desc = could not find container \"5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795\": container with ID starting with 5ba4f3f330ce75eb7e73769941a2825f7f92ad21aa035d85e08e654e9c8f0795 not found: ID does not exist" Apr 16 18:30:22.986377 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.986326 2576 scope.go:117] "RemoveContainer" containerID="2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97" Apr 16 18:30:22.993637 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.993617 2576 scope.go:117] "RemoveContainer" containerID="2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97" Apr 16 18:30:22.993879 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:30:22.993861 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97\": container with ID starting with 2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97 not found: ID does not exist" containerID="2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97" Apr 16 18:30:22.993947 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:22.993886 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97"} err="failed to get container status \"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97\": rpc error: code = NotFound desc = could not find container \"2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97\": container with ID starting with 
2228802969c264d2debf356ec55dd86d57d4f8dd0b29bb40284518f8f9f26d97 not found: ID does not exist" Apr 16 18:30:23.002937 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.002913 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:30:23.009232 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.009206 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-cc45f-predictor-c5c55775-f5627"] Apr 16 18:30:23.021644 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.021605 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:30:23.023414 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.023392 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cc45f-predictor-59db55445d-6kzqg"] Apr 16 18:30:23.899553 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.899518 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" path="/var/lib/kubelet/pods/620cdff5-ebb8-4fce-9e06-726bda85841e/volumes" Apr 16 18:30:23.899914 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:23.899751 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8020656a-80c3-4415-9248-4c823e247e60" path="/var/lib/kubelet/pods/8020656a-80c3-4415-9248-4c823e247e60/volumes" Apr 16 18:30:28.858461 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:28.858419 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:30:28.858816 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:28.858419 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:30:29.966505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:29.966458 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:30:29.966873 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:29.966468 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:38.858906 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:38.858864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:30:38.858906 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:38.858914 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:30:39.966630 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:39.966585 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:39.967012 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:39.966594 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:30:49.966797 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:49.966703 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:49.967168 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:49.966709 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:30:59.966194 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:59.966145 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 18:30:59.966579 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:30:59.966153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 18:31:08.042948 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.042911 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:31:08.043315 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.043239 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" Apr 16 18:31:08.043315 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.043275 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" Apr 16 18:31:08.043315 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.043287 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" Apr 16 18:31:08.043315 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.043292 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" Apr 16 18:31:08.043449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.043372 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8020656a-80c3-4415-9248-4c823e247e60" containerName="kserve-container" Apr 16 18:31:08.043449 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:31:08.043383 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="620cdff5-ebb8-4fce-9e06-726bda85841e" containerName="kserve-container" Apr 16 18:31:08.047668 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.047646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:31:08.057115 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.057099 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:31:08.087769 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.087735 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:31:08.088041 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.088018 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" containerID="cri-o://92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124" gracePeriod=30 Apr 16 18:31:08.089596 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.089574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:31:08.239684 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.239649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:31:08.242808 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:31:08.242776 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647737af_9671_4afe_8538_e0a4e77b86a2.slice/crio-1510224ef2c370b541ca6a6b05bd51358c56d1ba48a7ee3f8bc3b21462cd2498 WatchSource:0}: Error finding container 1510224ef2c370b541ca6a6b05bd51358c56d1ba48a7ee3f8bc3b21462cd2498: Status 404 returned error can't find the container with id 1510224ef2c370b541ca6a6b05bd51358c56d1ba48a7ee3f8bc3b21462cd2498 Apr 16 18:31:08.458530 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.458490 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:31:08.461761 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.461737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:31:08.471497 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.471478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:31:08.489716 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.489680 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:31:08.489930 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.489905 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" containerID="cri-o://937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845" gracePeriod=30 Apr 16 18:31:08.550000 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.549966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:31:08.823483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.823452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:31:08.825666 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:31:08.825628 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4278d34b_1456_4b6a_b349_dc07785eda2f.slice/crio-3efa16bc56cf71aeb56ed87959c543b3c9b35befba64dc47bbd33cf42574e83f WatchSource:0}: Error finding container 3efa16bc56cf71aeb56ed87959c543b3c9b35befba64dc47bbd33cf42574e83f: Status 404 returned error can't find the container with id 3efa16bc56cf71aeb56ed87959c543b3c9b35befba64dc47bbd33cf42574e83f Apr 16 18:31:08.858037 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.858007 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 18:31:08.858134 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:08.858007 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 18:31:09.133760 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.133726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" event={"ID":"647737af-9671-4afe-8538-e0a4e77b86a2","Type":"ContainerStarted","Data":"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d"} Apr 16 18:31:09.134209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.133767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" event={"ID":"647737af-9671-4afe-8538-e0a4e77b86a2","Type":"ContainerStarted","Data":"1510224ef2c370b541ca6a6b05bd51358c56d1ba48a7ee3f8bc3b21462cd2498"} Apr 16 18:31:09.134209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.133938 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:31:09.135167 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.135144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" event={"ID":"4278d34b-1456-4b6a-b349-dc07785eda2f","Type":"ContainerStarted","Data":"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6"} Apr 16 18:31:09.135280 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.135174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" event={"ID":"4278d34b-1456-4b6a-b349-dc07785eda2f","Type":"ContainerStarted","Data":"3efa16bc56cf71aeb56ed87959c543b3c9b35befba64dc47bbd33cf42574e83f"} Apr 16 18:31:09.135384 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.135365 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:31:09.135687 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.135666 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:09.136227 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.136206 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:31:09.171182 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.171131 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podStartSLOduration=2.171117022 podStartE2EDuration="2.171117022s" podCreationTimestamp="2026-04-16 18:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:09.170258391 +0000 UTC m=+1269.873589905" watchObservedRunningTime="2026-04-16 18:31:09.171117022 +0000 UTC m=+1269.874448525" Apr 16 18:31:09.216097 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.216049 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podStartSLOduration=1.216032963 podStartE2EDuration="1.216032963s" podCreationTimestamp="2026-04-16 18:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:09.214789368 +0000 UTC m=+1269.918120903" watchObservedRunningTime="2026-04-16 18:31:09.216032963 +0000 UTC m=+1269.919364542" Apr 16 18:31:09.967328 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.967296 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:31:09.967571 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:09.967357 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:31:10.138685 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:10.138649 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:31:10.139044 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:10.138650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:11.759788 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:11.759766 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:31:11.762841 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:11.762822 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:31:12.146417 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.146379 2576 generic.go:358] "Generic (PLEG): container finished" podID="043d90df-836c-46a3-9145-e88f9c1e786f" containerID="92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124" exitCode=0 Apr 16 18:31:12.146585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.146443 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" Apr 16 18:31:12.146585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.146451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" event={"ID":"043d90df-836c-46a3-9145-e88f9c1e786f","Type":"ContainerDied","Data":"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124"} Apr 16 18:31:12.146585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.146489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4" event={"ID":"043d90df-836c-46a3-9145-e88f9c1e786f","Type":"ContainerDied","Data":"622f6eab4325119b293da21b51cec80c64200323ea54bb44ae316bb2de29b4c0"} Apr 16 18:31:12.146585 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.146505 2576 scope.go:117] "RemoveContainer" containerID="92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124" Apr 16 18:31:12.147573 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.147552 2576 generic.go:358] "Generic (PLEG): container finished" podID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerID="937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845" exitCode=0 Apr 16 18:31:12.147682 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.147593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" event={"ID":"f3470a1d-471d-46f5-bfa1-174b9c19e40d","Type":"ContainerDied","Data":"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845"} Apr 16 18:31:12.147682 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.147610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" event={"ID":"f3470a1d-471d-46f5-bfa1-174b9c19e40d","Type":"ContainerDied","Data":"58f2a07785622d1be3ba9bc39df2e1ce6f9dd3cfc5d5bcad149ab0ba9870182b"} Apr 16 18:31:12.147682 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.147672 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh" Apr 16 18:31:12.154350 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.154325 2576 scope.go:117] "RemoveContainer" containerID="92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124" Apr 16 18:31:12.154597 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:31:12.154581 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124\": container with ID starting with 92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124 not found: ID does not exist" containerID="92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124" Apr 16 18:31:12.154660 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.154603 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124"} err="failed to get container status \"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124\": rpc error: code = NotFound desc = could not find container \"92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124\": container with ID starting with 92f5948b68182f02c9add96d24533c13aa6cf98d6e1f84f62e200e4543ecc124 not found: ID does not exist" Apr 16 18:31:12.154660 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.154617 2576 scope.go:117] "RemoveContainer" containerID="937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845" Apr 16 18:31:12.161391 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.161378 2576 scope.go:117] "RemoveContainer" containerID="937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845" Apr 16 18:31:12.161620 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:31:12.161604 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845\": container with ID starting with 937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845 not found: ID does not exist" containerID="937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845" Apr 16 18:31:12.161675 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.161627 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845"} err="failed to get container status \"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845\": rpc error: code = NotFound desc = could not find container \"937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845\": container with ID starting with 937f30ea2318f857913d3406557a3ae19bee5103a8c1d4e710d8042f20100845 not found: ID does not exist" Apr 16 18:31:12.184879 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.184851 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:31:12.191933 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.191909 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c659a-predictor-587d4db8d4-kzxw4"] Apr 16 18:31:12.233023 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:12.232992 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:31:12.237161 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:31:12.237136 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c659a-predictor-869ffd459c-2g2gh"] Apr 16 18:31:13.898825 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:13.898787 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" path="/var/lib/kubelet/pods/043d90df-836c-46a3-9145-e88f9c1e786f/volumes" Apr 16 18:31:13.899205 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:13.899017 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" path="/var/lib/kubelet/pods/f3470a1d-471d-46f5-bfa1-174b9c19e40d/volumes" Apr 16 18:31:20.139466 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:20.139422 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:20.139466 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:20.139430 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:31:30.138925 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:30.138885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:31:30.139330 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:30.138885 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:40.139126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:40.139087 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:31:40.139126 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:40.139105 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:50.139239 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:50.139198 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 18:31:50.139239 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:31:50.139220 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 18:32:00.139916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:32:00.139877 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:32:00.140336 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:32:00.139929 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:39:42.819929 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.819880 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:39:42.822645 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.820153 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" containerID="cri-o://92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346" gracePeriod=30 Apr 16 18:39:42.876578 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:39:42.876895 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876881 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" Apr 16 18:39:42.876895 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876895 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" Apr 16 18:39:42.877030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876915 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" Apr 16 18:39:42.877030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876921 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" Apr 16 18:39:42.877030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876970 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3470a1d-471d-46f5-bfa1-174b9c19e40d" containerName="kserve-container" Apr 16 18:39:42.877030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.876977 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="043d90df-836c-46a3-9145-e88f9c1e786f" containerName="kserve-container" Apr 16 18:39:42.879972 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.879952 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:39:42.887324 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.887298 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:39:42.887549 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.887521 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" containerID="cri-o://8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055" gracePeriod=30 Apr 16 18:39:42.892445 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.892407 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:39:42.892608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.892509 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:39:42.921039 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.921013 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:39:42.924431 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.924412 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:39:42.932061 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.932036 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:39:42.935345 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:42.935322 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:39:43.032692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.032660 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:39:43.036012 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:39:43.035985 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f829679_ab15_4bb0_8524_a33bb5964964.slice/crio-573c4f87603a3834cf390152eab363dbedfd9c82aa688a5e97c94c517f5d73ec WatchSource:0}: Error finding container 573c4f87603a3834cf390152eab363dbedfd9c82aa688a5e97c94c517f5d73ec: Status 404 returned error can't find the container with id 573c4f87603a3834cf390152eab363dbedfd9c82aa688a5e97c94c517f5d73ec Apr 16 18:39:43.038209 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.038189 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:39:43.077166 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.077054 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:39:43.079768 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:39:43.079743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9a9ac1_318b_438b_8ab5_13cafbc3a4d4.slice/crio-dbcd425b46c820df58d11e2fed94d057262ff2f42702f7335c6a6c3921d3f6ec WatchSource:0}: Error finding container dbcd425b46c820df58d11e2fed94d057262ff2f42702f7335c6a6c3921d3f6ec: Status 404 returned error can't find the container with id dbcd425b46c820df58d11e2fed94d057262ff2f42702f7335c6a6c3921d3f6ec Apr 16 18:39:43.804537 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.804498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" event={"ID":"7f829679-ab15-4bb0-8524-a33bb5964964","Type":"ContainerStarted","Data":"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7"} Apr 16 18:39:43.804537 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.804536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" event={"ID":"7f829679-ab15-4bb0-8524-a33bb5964964","Type":"ContainerStarted","Data":"573c4f87603a3834cf390152eab363dbedfd9c82aa688a5e97c94c517f5d73ec"} Apr 16 18:39:43.804778 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.804670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:39:43.805917 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.805894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" event={"ID":"cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4","Type":"ContainerStarted","Data":"d6b772010c270ef240a4c25c2305ec86100672390b4f600061b721ebd9fe33bb"} Apr 16 18:39:43.806034 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.805919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" event={"ID":"cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4","Type":"ContainerStarted","Data":"dbcd425b46c820df58d11e2fed94d057262ff2f42702f7335c6a6c3921d3f6ec"} Apr 16 18:39:43.806120 ip-10-0-143-51 kubenswrapper[2576]: 
I0416 18:39:43.806103 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:39:43.806204 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.806169 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:39:43.806961 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.806943 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:39:43.822138 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.822092 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podStartSLOduration=1.822080337 podStartE2EDuration="1.822080337s" podCreationTimestamp="2026-04-16 18:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:43.819607058 +0000 UTC m=+1784.522938583" watchObservedRunningTime="2026-04-16 18:39:43.822080337 +0000 UTC m=+1784.525411861" Apr 16 18:39:43.833129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:43.833077 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podStartSLOduration=1.8330636500000002 podStartE2EDuration="1.83306365s" podCreationTimestamp="2026-04-16 18:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:43.83227259 +0000 UTC m=+1784.535604126" watchObservedRunningTime="2026-04-16 18:39:43.83306365 +0000 UTC m=+1784.536395175" Apr 16 18:39:44.809327 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:44.809281 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:39:44.809525 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:44.809456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:39:46.063611 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.063588 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:39:46.817438 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.817399 2576 generic.go:358] "Generic (PLEG): container finished" podID="176a5137-9f81-42d9-9bae-968764c70863" containerID="92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346" exitCode=0 Apr 16 18:39:46.817608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.817470 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" Apr 16 18:39:46.817608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.817479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" event={"ID":"176a5137-9f81-42d9-9bae-968764c70863","Type":"ContainerDied","Data":"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346"} Apr 16 18:39:46.817608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.817519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n" event={"ID":"176a5137-9f81-42d9-9bae-968764c70863","Type":"ContainerDied","Data":"03d19aa6b0227aa174f1b1f25557454081a5a51c789b11e6ad9faeb3c6032699"} Apr 16 18:39:46.817608 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.817536 2576 scope.go:117] "RemoveContainer" containerID="92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346" Apr 16 18:39:46.826359 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.826339 2576 scope.go:117] "RemoveContainer" containerID="92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346" Apr 16 18:39:46.826807 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:39:46.826780 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346\": container with ID starting with 92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346 not found: ID does not exist" containerID="92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346" Apr 16 18:39:46.826906 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.826818 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346"} err="failed to get container status \"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346\": rpc error: code = NotFound desc = could not find container \"92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346\": container with ID starting with 92fc941ce422da12ceb907260e907758f3fb0c7400daddac7a601e8a5af0f346 not found: ID does not exist" Apr 16 18:39:46.838577 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.838551 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:39:46.842309 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:46.842291 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-bac14-predictor-6df98975fd-g7z7n"] Apr 16 18:39:47.899359 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:47.899329 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="176a5137-9f81-42d9-9bae-968764c70863" path="/var/lib/kubelet/pods/176a5137-9f81-42d9-9bae-968764c70863/volumes" Apr 16 18:39:49.135022 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:39:49.134989 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:39:49.830647 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.830606 2576 generic.go:358] "Generic (PLEG): container finished" podID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerID="8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055" exitCode=0 Apr 16 18:39:49.830845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.830671 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" Apr 16 18:39:49.830845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.830683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" event={"ID":"7b9ab55c-1bd0-4f32-8561-b12fc9b20f77","Type":"ContainerDied","Data":"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055"} Apr 16 18:39:49.830845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.830720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9" event={"ID":"7b9ab55c-1bd0-4f32-8561-b12fc9b20f77","Type":"ContainerDied","Data":"550169b88fadb75969200aaa48e02ca51b8d7870af8f7aead87cf69b57836f52"} Apr 16 18:39:49.830845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.830738 2576 scope.go:117] "RemoveContainer" containerID="8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055" Apr 16 18:39:49.839294 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.839270 2576 scope.go:117] "RemoveContainer" containerID="8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055" Apr 16 18:39:49.839523 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:39:49.839504 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055\": container with ID starting with 8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055 not found: ID does not exist" containerID="8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055" Apr 16 18:39:49.839580 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.839530 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055"} err="failed to get container status \"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055\": rpc error: code = NotFound desc = could not find container \"8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055\": container with ID starting with 8ecf216c40b602ce98b51a45d766721227af513a8175990adab0a027d6e2d055 not found: ID does not exist" Apr 16 18:39:49.851777 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.851748 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:39:49.857436 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.857414 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-bac14-predictor-5465cc4b59-fkqw9"] Apr 16 18:39:49.905082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:49.905053 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" 
path="/var/lib/kubelet/pods/7b9ab55c-1bd0-4f32-8561-b12fc9b20f77/volumes" Apr 16 18:39:54.809467 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:54.809429 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:39:54.809851 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:39:54.809448 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:40:04.809344 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:04.809300 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:40:04.809707 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:04.809300 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:40:14.809880 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:14.809840 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:40:14.810415 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:14.809843 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:40:24.809755 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:24.809711 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:40:24.810224 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:24.809716 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:40:32.660636 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.660597 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:40:32.660998 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.660856 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" 
podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" containerID="cri-o://cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6" gracePeriod=30 Apr 16 18:40:32.707373 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.707342 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:40:32.707592 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.707564 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" containerID="cri-o://7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d" gracePeriod=30 Apr 16 18:40:32.730224 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730195 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:40:32.730558 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730545 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" Apr 16 18:40:32.730558 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730560 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" Apr 16 18:40:32.730638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730578 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" Apr 16 18:40:32.730638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730584 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" Apr 16 18:40:32.730702 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730648 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b9ab55c-1bd0-4f32-8561-b12fc9b20f77" containerName="kserve-container" Apr 16 18:40:32.730702 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.730659 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="176a5137-9f81-42d9-9bae-968764c70863" containerName="kserve-container" Apr 16 18:40:32.733476 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.733457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:40:32.741240 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.741213 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:40:32.743860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.743841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:40:32.818701 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.818669 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:40:32.823082 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.823056 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:40:32.829496 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.828912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:40:32.836595 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.836577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:40:32.878518 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.878484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:40:32.881743 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:40:32.881714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e645ea6_f77f_4abb_86cd_c23525ea395e.slice/crio-c2ddb1f9450d246100d0b9f61a8ff4a7908176e67470985229c64ebd2467fedb WatchSource:0}: Error finding container c2ddb1f9450d246100d0b9f61a8ff4a7908176e67470985229c64ebd2467fedb: Status 404 returned error can't find the container with id c2ddb1f9450d246100d0b9f61a8ff4a7908176e67470985229c64ebd2467fedb Apr 16 18:40:32.982005 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.981984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:40:32.982457 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.982427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" event={"ID":"2e645ea6-f77f-4abb-86cd-c23525ea395e","Type":"ContainerStarted","Data":"4b4eeafeb9a99189ef9fdd918f467c8fda8c54e8ff82f465440ceb875bd96865"} Apr 16 18:40:32.982457 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.982459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" event={"ID":"2e645ea6-f77f-4abb-86cd-c23525ea395e","Type":"ContainerStarted","Data":"c2ddb1f9450d246100d0b9f61a8ff4a7908176e67470985229c64ebd2467fedb"} Apr 16 18:40:32.982671 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.982656 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:40:32.983834 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.983806 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:40:32.984163 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:40:32.984140 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f4850c_8d86_4234_926d_3f0f9b821487.slice/crio-0fd2d585ab918056c26d770e386f3fa6ce1049e1a5b28ce6d5ecdce40f92d1f2 WatchSource:0}: Error finding container 0fd2d585ab918056c26d770e386f3fa6ce1049e1a5b28ce6d5ecdce40f92d1f2: Status 404 returned error can't find the container with id 0fd2d585ab918056c26d770e386f3fa6ce1049e1a5b28ce6d5ecdce40f92d1f2 Apr 16 18:40:32.998822 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:32.998783 2576 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podStartSLOduration=0.998769335 podStartE2EDuration="998.769335ms" podCreationTimestamp="2026-04-16 18:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:32.996945299 +0000 UTC m=+1833.700276818" watchObservedRunningTime="2026-04-16 18:40:32.998769335 +0000 UTC m=+1833.702100858" Apr 16 18:40:33.986556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:33.986514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" event={"ID":"76f4850c-8d86-4234-926d-3f0f9b821487","Type":"ContainerStarted","Data":"d88d28430f0c1586b15f25e653580ece7fbe9d4d1b240cbb325bde2119e7c3bb"} Apr 16 18:40:33.986556 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:33.986555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" event={"ID":"76f4850c-8d86-4234-926d-3f0f9b821487","Type":"ContainerStarted","Data":"0fd2d585ab918056c26d770e386f3fa6ce1049e1a5b28ce6d5ecdce40f92d1f2"} Apr 16 18:40:33.987092 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:33.986874 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:40:34.002370 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:34.002329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podStartSLOduration=2.00231602 podStartE2EDuration="2.00231602s" podCreationTimestamp="2026-04-16 18:40:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:34.001018043 +0000 UTC m=+1834.704349585" watchObservedRunningTime="2026-04-16 18:40:34.00231602 +0000 UTC m=+1834.705647543" Apr 16 18:40:34.810611 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:34.810575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:40:34.811065 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:34.811043 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:40:34.989825 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:34.989788 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:40:34.991183 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:34.991149 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:40:35.993729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:35.993685 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:40:36.148586 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.148565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:40:36.601888 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.601865 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:40:36.997531 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.997495 2576 generic.go:358] "Generic (PLEG): container finished" podID="647737af-9671-4afe-8538-e0a4e77b86a2" containerID="7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d" exitCode=0 Apr 16 18:40:36.997964 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.997580 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" Apr 16 18:40:36.997964 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.997583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" event={"ID":"647737af-9671-4afe-8538-e0a4e77b86a2","Type":"ContainerDied","Data":"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d"} Apr 16 18:40:36.997964 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.997630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4" event={"ID":"647737af-9671-4afe-8538-e0a4e77b86a2","Type":"ContainerDied","Data":"1510224ef2c370b541ca6a6b05bd51358c56d1ba48a7ee3f8bc3b21462cd2498"} Apr 16 18:40:36.997964 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.997652 2576 scope.go:117] "RemoveContainer" containerID="7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d" Apr 16 18:40:36.998760 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.998729 2576 generic.go:358] "Generic (PLEG): container finished" podID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerID="cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6" exitCode=0 Apr 16 18:40:36.998902 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.998811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" event={"ID":"4278d34b-1456-4b6a-b349-dc07785eda2f","Type":"ContainerDied","Data":"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6"} Apr 16 18:40:36.998902 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.998848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" event={"ID":"4278d34b-1456-4b6a-b349-dc07785eda2f","Type":"ContainerDied","Data":"3efa16bc56cf71aeb56ed87959c543b3c9b35befba64dc47bbd33cf42574e83f"} Apr 16 18:40:36.998902 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:36.998817 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg" Apr 16 18:40:37.007021 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.006991 2576 scope.go:117] "RemoveContainer" containerID="7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d" Apr 16 18:40:37.007289 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:40:37.007269 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d\": container with ID starting with 7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d not found: ID does not exist" containerID="7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d" Apr 16 18:40:37.007383 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.007297 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d"} err="failed to get container status \"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d\": rpc error: code = NotFound desc = could not find container \"7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d\": container with ID starting with 7249183fda850eb63cc5ad004509183f16d5a38b782cc379da21a3529007094d not found: ID does not exist" Apr 16 18:40:37.007383 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.007314 2576 scope.go:117] "RemoveContainer" containerID="cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6" Apr 16 18:40:37.014138 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.014123 2576 scope.go:117] "RemoveContainer" containerID="cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6" Apr 16 18:40:37.014417 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:40:37.014401 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6\": container with ID starting with cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6 not found: ID does not exist" containerID="cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6" Apr 16 18:40:37.014464 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.014424 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6"} err="failed to get container status \"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6\": rpc error: code = NotFound desc = could not find container \"cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6\": container with ID starting with cd5dfcb0a836f85badedc1b34d556375eb3326cf3ec372e6c8ea919c410946b6 not found: ID does not exist" Apr 16 18:40:37.022014 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.021993 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:40:37.025519 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.025499 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e9680-predictor-8689cf6cdf-k7trg"] Apr 16 18:40:37.036107 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.036088 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:40:37.045731 ip-10-0-143-51 
kubenswrapper[2576]: I0416 18:40:37.045707 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e9680-predictor-7bc5c87f48-dsrc4"] Apr 16 18:40:37.898883 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.898851 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" path="/var/lib/kubelet/pods/4278d34b-1456-4b6a-b349-dc07785eda2f/volumes" Apr 16 18:40:37.899085 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:37.899072 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" path="/var/lib/kubelet/pods/647737af-9671-4afe-8538-e0a4e77b86a2/volumes" Apr 16 18:40:43.987195 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:43.987153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:40:45.994170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:45.994120 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:40:53.986991 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:53.986949 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:40:55.994457 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:40:55.994408 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:41:03.188022 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.187983 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:41:03.191030 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.191003 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" containerID="cri-o://d6b772010c270ef240a4c25c2305ec86100672390b4f600061b721ebd9fe33bb" gracePeriod=30 Apr 16 18:41:03.251888 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.251852 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:41:03.252130 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.252106 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" containerID="cri-o://59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7" gracePeriod=30 Apr 16 18:41:03.276012 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.275981 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:41:03.276380 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276366 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" Apr 16 18:41:03.276380 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276381 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" Apr 16 18:41:03.276483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276393 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" Apr 16 18:41:03.276483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" Apr 16 18:41:03.276483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276446 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4278d34b-1456-4b6a-b349-dc07785eda2f" containerName="kserve-container" Apr 16 18:41:03.276483 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.276455 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="647737af-9671-4afe-8538-e0a4e77b86a2" containerName="kserve-container" Apr 16 18:41:03.280772 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.280748 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:41:03.291582 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.291560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:41:03.292411 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.292391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:41:03.380405 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.380375 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:41:03.385267 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.385229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:41:03.395040 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.394997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:41:03.397790 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.397771 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:41:03.430808 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.430516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:41:03.438138 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:41:03.438027 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240de76b_805e_45c1_a36a_5c49c583f087.slice/crio-24aebb90fd622f4734a28eb79a5b6649e68e70f1234c1faaccb5260140d5cb45 WatchSource:0}: Error finding container 24aebb90fd622f4734a28eb79a5b6649e68e70f1234c1faaccb5260140d5cb45: Status 404 returned error can't find the container with id 24aebb90fd622f4734a28eb79a5b6649e68e70f1234c1faaccb5260140d5cb45 Apr 16 18:41:03.538238 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.538212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:41:03.540740 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:41:03.540706 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbea509_f542_489a_85da_ec07e5fa4820.slice/crio-1cbe003db4e3e5919dbdf432efb563cb7c75481c74207111e3cb72492f96126f WatchSource:0}: Error finding container 1cbe003db4e3e5919dbdf432efb563cb7c75481c74207111e3cb72492f96126f: Status 404 returned error can't find the container with id 1cbe003db4e3e5919dbdf432efb563cb7c75481c74207111e3cb72492f96126f Apr 16 18:41:03.987949 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:03.987905 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:41:04.089119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.089079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" event={"ID":"7bbea509-f542-489a-85da-ec07e5fa4820","Type":"ContainerStarted","Data":"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562"} Apr 16 18:41:04.089119 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.089123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" event={"ID":"7bbea509-f542-489a-85da-ec07e5fa4820","Type":"ContainerStarted","Data":"1cbe003db4e3e5919dbdf432efb563cb7c75481c74207111e3cb72492f96126f"} Apr 16 18:41:04.089390 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.089146 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:41:04.090405 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.090372 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:04.090638 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.090615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" 
event={"ID":"240de76b-805e-45c1-a36a-5c49c583f087","Type":"ContainerStarted","Data":"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0"} Apr 16 18:41:04.090712 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.090648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" event={"ID":"240de76b-805e-45c1-a36a-5c49c583f087","Type":"ContainerStarted","Data":"24aebb90fd622f4734a28eb79a5b6649e68e70f1234c1faaccb5260140d5cb45"} Apr 16 18:41:04.090832 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.090811 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:41:04.091860 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.091838 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:04.104784 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.104746 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podStartSLOduration=1.104733889 podStartE2EDuration="1.104733889s" podCreationTimestamp="2026-04-16 18:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:04.103780206 +0000 UTC m=+1864.807111728" watchObservedRunningTime="2026-04-16 18:41:04.104733889 +0000 UTC m=+1864.808065497" Apr 16 18:41:04.118673 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.118633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podStartSLOduration=1.118622303 podStartE2EDuration="1.118622303s" podCreationTimestamp="2026-04-16 18:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:04.116826118 +0000 UTC m=+1864.820157642" watchObservedRunningTime="2026-04-16 18:41:04.118622303 +0000 UTC m=+1864.821953826" Apr 16 18:41:04.809532 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.809484 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 18:41:04.809913 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:04.809484 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 18:41:05.093856 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:05.093762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:05.094001 ip-10-0-143-51 kubenswrapper[2576]: 
I0416 18:41:05.093762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:05.994486 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:05.994446 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:41:06.391954 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:06.391931 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:41:07.101813 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.101782 2576 generic.go:358] "Generic (PLEG): container finished" podID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerID="d6b772010c270ef240a4c25c2305ec86100672390b4f600061b721ebd9fe33bb" exitCode=0 Apr 16 18:41:07.102207 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.101854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" event={"ID":"cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4","Type":"ContainerDied","Data":"d6b772010c270ef240a4c25c2305ec86100672390b4f600061b721ebd9fe33bb"} Apr 16 18:41:07.103027 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.103004 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f829679-ab15-4bb0-8524-a33bb5964964" containerID="59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7" exitCode=0 Apr 16 18:41:07.103129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.103039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" event={"ID":"7f829679-ab15-4bb0-8524-a33bb5964964","Type":"ContainerDied","Data":"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7"} Apr 16 18:41:07.103129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.103065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" event={"ID":"7f829679-ab15-4bb0-8524-a33bb5964964","Type":"ContainerDied","Data":"573c4f87603a3834cf390152eab363dbedfd9c82aa688a5e97c94c517f5d73ec"} Apr 16 18:41:07.103129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.103065 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt" Apr 16 18:41:07.103129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.103086 2576 scope.go:117] "RemoveContainer" containerID="59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7" Apr 16 18:41:07.110815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.110798 2576 scope.go:117] "RemoveContainer" containerID="59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7" Apr 16 18:41:07.111058 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:41:07.111041 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7\": container with ID starting with 59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7 not found: ID does not exist" containerID="59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7" Apr 16 18:41:07.111129 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.111068 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7"} err="failed to get container status \"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7\": rpc error: code = NotFound desc = could not find container \"59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7\": container with ID starting with 59a9f7d3dbf5d0a3d9ebb024c1f93325536891ad9dde0120011d3f8a76530ab7 not found: ID does not exist" Apr 16 18:41:07.127474 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.127300 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:41:07.129677 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.129656 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e04f1-predictor-df557c688-bbnbt"] Apr 16 18:41:07.136694 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.136671 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:41:07.903740 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:07.903703 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" path="/var/lib/kubelet/pods/7f829679-ab15-4bb0-8524-a33bb5964964/volumes" Apr 16 18:41:08.106932 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:08.106900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" event={"ID":"cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4","Type":"ContainerDied","Data":"dbcd425b46c820df58d11e2fed94d057262ff2f42702f7335c6a6c3921d3f6ec"} Apr 16 18:41:08.106932 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:08.106923 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8" Apr 16 18:41:08.107492 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:08.106942 2576 scope.go:117] "RemoveContainer" containerID="d6b772010c270ef240a4c25c2305ec86100672390b4f600061b721ebd9fe33bb" Apr 16 18:41:08.121528 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:08.121506 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:41:08.127929 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:08.127902 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e04f1-predictor-6cc8df4c7c-blqf8"] Apr 16 18:41:09.899328 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:09.899294 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" path="/var/lib/kubelet/pods/cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4/volumes" Apr 16 18:41:13.987755 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:13.987663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:41:15.094170 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:15.094130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:15.094567 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:15.094131 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:15.993925 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:15.993880 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 18:41:23.987951 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:23.987914 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:41:25.094759 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:25.094722 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:25.095137 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:25.094722 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:25.995447 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:41:25.995415 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:41:35.094165 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:35.094116 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:35.094562 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:35.094118 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:45.093922 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:45.093878 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 16 18:41:45.094306 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:45.093878 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 18:41:55.094418 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:55.094390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:41:55.094990 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:41:55.094918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:50:28.171854 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:28.171772 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:50:28.174232 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:28.172065 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" containerID="cri-o://d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0" gracePeriod=30 Apr 16 18:50:28.209393 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:28.209363 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:50:28.209599 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:28.209580 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" containerID="cri-o://cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562" gracePeriod=30 Apr 16 18:50:31.355211 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.355184 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:50:31.421454 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.421432 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:50:31.913327 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.913301 2576 generic.go:358] "Generic (PLEG): container finished" podID="240de76b-805e-45c1-a36a-5c49c583f087" containerID="d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0" exitCode=0 Apr 16 18:50:31.913449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.913373 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" Apr 16 18:50:31.913449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.913379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" event={"ID":"240de76b-805e-45c1-a36a-5c49c583f087","Type":"ContainerDied","Data":"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0"} Apr 16 18:50:31.913449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.913417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r" event={"ID":"240de76b-805e-45c1-a36a-5c49c583f087","Type":"ContainerDied","Data":"24aebb90fd622f4734a28eb79a5b6649e68e70f1234c1faaccb5260140d5cb45"} Apr 16 18:50:31.913449 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.913438 2576 scope.go:117] "RemoveContainer" containerID="d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0" Apr 16 18:50:31.914538 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.914516 2576 generic.go:358] "Generic (PLEG): container finished" podID="7bbea509-f542-489a-85da-ec07e5fa4820" containerID="cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562" exitCode=0 Apr 16 18:50:31.914636 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.914565 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" Apr 16 18:50:31.914636 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.914600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" event={"ID":"7bbea509-f542-489a-85da-ec07e5fa4820","Type":"ContainerDied","Data":"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562"} Apr 16 18:50:31.914636 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.914632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p" event={"ID":"7bbea509-f542-489a-85da-ec07e5fa4820","Type":"ContainerDied","Data":"1cbe003db4e3e5919dbdf432efb563cb7c75481c74207111e3cb72492f96126f"} Apr 16 18:50:31.921476 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.921441 2576 scope.go:117] "RemoveContainer" containerID="d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0" Apr 16 18:50:31.921689 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:50:31.921668 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0\": container with ID starting with d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0 not found: ID does not exist" containerID="d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0" Apr 16 18:50:31.921773 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.921695 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0"} err="failed to get container status \"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0\": rpc error: code = NotFound desc = could not find container \"d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0\": container with ID starting with d04b47e2f03a50deb8f2485dd1b8c8b4d2816149ece7cf2250523c709d1278f0 not found: ID does not exist" Apr 16 18:50:31.921773 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.921713 2576 scope.go:117] "RemoveContainer" containerID="cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562" Apr 16 18:50:31.928455 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.928437 2576 scope.go:117] "RemoveContainer" containerID="cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562" Apr 16 18:50:31.928680 ip-10-0-143-51 kubenswrapper[2576]: E0416 18:50:31.928665 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562\": container with ID starting with cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562 not found: ID does not exist" containerID="cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562" Apr 16 18:50:31.928727 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.928685 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562"} err="failed to get container status \"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562\": rpc error: code = NotFound desc = could not find container \"cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562\": container with ID starting with 
cc704189703d70e9432fd07668d4beeb7b5471466df0ef48dc7a875d3a155562 not found: ID does not exist" Apr 16 18:50:31.931692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.931672 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:50:31.935525 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.935504 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1ac5a-predictor-67ff7bfc58-mgh7r"] Apr 16 18:50:31.945845 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.945822 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:50:31.950918 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:31.950893 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1ac5a-predictor-697b8f655f-5nh8p"] Apr 16 18:50:33.899533 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:33.899502 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240de76b-805e-45c1-a36a-5c49c583f087" path="/var/lib/kubelet/pods/240de76b-805e-45c1-a36a-5c49c583f087/volumes" Apr 16 18:50:33.899966 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:50:33.899722 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" path="/var/lib/kubelet/pods/7bbea509-f542-489a-85da-ec07e5fa4820/volumes" Apr 16 18:58:02.430692 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:02.430615 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:58:02.433043 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:02.430852 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" containerID="cri-o://4b4eeafeb9a99189ef9fdd918f467c8fda8c54e8ff82f465440ceb875bd96865" gracePeriod=30 Apr 16 18:58:02.494039 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:02.494005 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:58:02.494315 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:02.494279 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" containerID="cri-o://d88d28430f0c1586b15f25e653580ece7fbe9d4d1b240cbb325bde2119e7c3bb" gracePeriod=30 Apr 16 18:58:03.987188 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:03.987142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 18:58:05.379417 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.379383 2576 generic.go:358] "Generic (PLEG): container finished" podID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerID="4b4eeafeb9a99189ef9fdd918f467c8fda8c54e8ff82f465440ceb875bd96865" exitCode=0 Apr 16 18:58:05.379815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.379423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" event={"ID":"2e645ea6-f77f-4abb-86cd-c23525ea395e","Type":"ContainerDied","Data":"4b4eeafeb9a99189ef9fdd918f467c8fda8c54e8ff82f465440ceb875bd96865"} Apr 16 18:58:05.381005 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.380980 2576 generic.go:358] "Generic (PLEG): container finished" podID="76f4850c-8d86-4234-926d-3f0f9b821487" containerID="d88d28430f0c1586b15f25e653580ece7fbe9d4d1b240cbb325bde2119e7c3bb" exitCode=0 Apr 16 18:58:05.381107 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.381058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" event={"ID":"76f4850c-8d86-4234-926d-3f0f9b821487","Type":"ContainerDied","Data":"d88d28430f0c1586b15f25e653580ece7fbe9d4d1b240cbb325bde2119e7c3bb"} Apr 16 18:58:05.479344 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.479319 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:58:05.822387 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:05.822364 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:58:06.386930 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.386891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" event={"ID":"2e645ea6-f77f-4abb-86cd-c23525ea395e","Type":"ContainerDied","Data":"c2ddb1f9450d246100d0b9f61a8ff4a7908176e67470985229c64ebd2467fedb"} Apr 16 18:58:06.386930 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.386925 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv" Apr 16 18:58:06.387463 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.386949 2576 scope.go:117] "RemoveContainer" containerID="4b4eeafeb9a99189ef9fdd918f467c8fda8c54e8ff82f465440ceb875bd96865" Apr 16 18:58:06.388120 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.388091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" event={"ID":"76f4850c-8d86-4234-926d-3f0f9b821487","Type":"ContainerDied","Data":"0fd2d585ab918056c26d770e386f3fa6ce1049e1a5b28ce6d5ecdce40f92d1f2"} Apr 16 18:58:06.388180 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.388142 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6" Apr 16 18:58:06.394957 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.394937 2576 scope.go:117] "RemoveContainer" containerID="d88d28430f0c1586b15f25e653580ece7fbe9d4d1b240cbb325bde2119e7c3bb" Apr 16 18:58:06.404355 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.404334 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:58:06.409540 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.409514 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dbc56-predictor-767bcc6cfb-qjprv"] Apr 16 18:58:06.418781 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.418758 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:58:06.422219 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:06.422200 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dbc56-predictor-7cc999bf45-7bxj6"] Apr 16 18:58:07.899055 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:07.899026 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" path="/var/lib/kubelet/pods/2e645ea6-f77f-4abb-86cd-c23525ea395e/volumes" Apr 16 18:58:07.899514 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:07.899266 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" path="/var/lib/kubelet/pods/76f4850c-8d86-4234-926d-3f0f9b821487/volumes" Apr 16 18:58:30.885910 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:30.885875 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-slcjb_b097be47-4dfd-4fcd-a8ab-78a2cd491538/global-pull-secret-syncer/0.log" Apr 16 18:58:30.972216 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:30.972159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lsx75_7a7a48be-c9f2-4cfa-8741-c0bfba5190ba/konnectivity-agent/0.log" Apr 16 18:58:31.090818 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:31.090768 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-51.ec2.internal_9a7b007f7f10e6a8374521e6218a633f/haproxy/0.log" Apr 16 18:58:34.794288 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:34.794254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-dwchl_343458e2-2056-4ac4-9044-aacda978bc94/kube-state-metrics/0.log" Apr 16 18:58:34.823976 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:34.823941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-dwchl_343458e2-2056-4ac4-9044-aacda978bc94/kube-rbac-proxy-main/0.log" Apr 16 18:58:34.845042 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:34.845013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-dwchl_343458e2-2056-4ac4-9044-aacda978bc94/kube-rbac-proxy-self/0.log" Apr 16 18:58:34.876619 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:34.876588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59f5c7d586-sdjkg_ec18e6e3-88bf-44f7-bf1d-bb7bad347114/metrics-server/0.log" Apr 16 18:58:34.907688 ip-10-0-143-51 kubenswrapper[2576]: I0416 
18:58:34.907660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-nd6pb_bf022871-f563-464d-9350-a198a2295c7f/monitoring-plugin/0.log" Apr 16 18:58:35.118609 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.118529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zg2cq_5cb51374-d2cd-4fec-8477-88d4c3d6d74a/node-exporter/0.log" Apr 16 18:58:35.146839 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.146811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zg2cq_5cb51374-d2cd-4fec-8477-88d4c3d6d74a/kube-rbac-proxy/0.log" Apr 16 18:58:35.174759 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.174734 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zg2cq_5cb51374-d2cd-4fec-8477-88d4c3d6d74a/init-textfile/0.log" Apr 16 18:58:35.659645 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.659613 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/thanos-query/0.log" Apr 16 18:58:35.683224 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.683197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/kube-rbac-proxy-web/0.log" Apr 16 18:58:35.706125 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.706097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/kube-rbac-proxy/0.log" Apr 16 18:58:35.731328 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.731297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/prom-label-proxy/0.log" Apr 16 18:58:35.755475 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.755454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/kube-rbac-proxy-rules/0.log" Apr 16 18:58:35.789402 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:35.789356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f49994ccb-rb9jz_252e09bb-e8cc-4008-9e5a-8c49ae1beec9/kube-rbac-proxy-metrics/0.log" Apr 16 18:58:38.219391 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219359 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs"] Apr 16 18:58:38.219855 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219822 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" Apr 16 18:58:38.219855 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219840 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219858 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219867 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219893 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219905 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219913 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219927 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219935 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219946 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" Apr 16 18:58:38.219974 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.219954 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220045 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e645ea6-f77f-4abb-86cd-c23525ea395e" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220059 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bbea509-f542-489a-85da-ec07e5fa4820" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220070 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f829679-ab15-4bb0-8524-a33bb5964964" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="76f4850c-8d86-4234-926d-3f0f9b821487" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220094 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc9a9ac1-318b-438b-8ab5-13cafbc3a4d4" containerName="kserve-container" Apr 16 18:58:38.220451 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.220103 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="240de76b-805e-45c1-a36a-5c49c583f087" containerName="kserve-container" Apr 16 18:58:38.223113 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.223095 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.225852 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.225836 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"openshift-service-ca.crt\"" Apr 16 18:58:38.226505 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.226487 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lv8lf\"/\"default-dockercfg-r8bqr\"" Apr 16 18:58:38.226620 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.226511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"kube-root-ca.crt\"" Apr 16 18:58:38.231982 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.231956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs"] Apr 16 18:58:38.316560 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.316518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-sys\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.316560 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.316563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-lib-modules\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.316768 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.316592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-proc\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.316768 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.316636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-podres\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.316768 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.316735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5fk\" (UniqueName: \"kubernetes.io/projected/650d47b5-9038-403a-a888-ec270a94a288-kube-api-access-jn5fk\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-sys\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " 
pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-lib-modules\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-proc\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-podres\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5fk\" (UniqueName: \"kubernetes.io/projected/650d47b5-9038-403a-a888-ec270a94a288-kube-api-access-jn5fk\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-sys\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-proc\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-lib-modules\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.417729 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.417676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/650d47b5-9038-403a-a888-ec270a94a288-podres\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.426106 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.426083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jn5fk\" (UniqueName: \"kubernetes.io/projected/650d47b5-9038-403a-a888-ec270a94a288-kube-api-access-jn5fk\") pod \"perf-node-gather-daemonset-b84qs\" (UID: \"650d47b5-9038-403a-a888-ec270a94a288\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.533744 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.533652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:38.653361 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.653326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs"] Apr 16 18:58:38.656926 ip-10-0-143-51 kubenswrapper[2576]: W0416 18:58:38.656893 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod650d47b5_9038_403a_a888_ec270a94a288.slice/crio-b8c57b070cb497e8347f3471f6c2c8a39e10610269d2cec5363ba20f8270d5de WatchSource:0}: Error finding container b8c57b070cb497e8347f3471f6c2c8a39e10610269d2cec5363ba20f8270d5de: Status 404 returned error can't find the container with id b8c57b070cb497e8347f3471f6c2c8a39e10610269d2cec5363ba20f8270d5de Apr 16 18:58:38.658530 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:38.658513 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:58:39.158433 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.158409 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jxhl9_206d1627-bacb-468e-ab0d-e0782a1f13d1/dns/0.log" Apr 16 18:58:39.181383 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.181359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jxhl9_206d1627-bacb-468e-ab0d-e0782a1f13d1/kube-rbac-proxy/0.log" Apr 16 18:58:39.371344 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.371314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tqcl9_62ab651d-790e-48bc-91c8-e40eada59965/dns-node-resolver/0.log" Apr 16 18:58:39.494815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.494727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" event={"ID":"650d47b5-9038-403a-a888-ec270a94a288","Type":"ContainerStarted","Data":"c297e41510240ab5ab418f9b5362d70ac56d281978eec5af0e7e920e16cbec4d"} Apr 16 18:58:39.494815 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.494768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" event={"ID":"650d47b5-9038-403a-a888-ec270a94a288","Type":"ContainerStarted","Data":"b8c57b070cb497e8347f3471f6c2c8a39e10610269d2cec5363ba20f8270d5de"} Apr 16 18:58:39.494995 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.494873 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:39.513086 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.513031 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" podStartSLOduration=1.5130131919999998 podStartE2EDuration="1.513013192s" podCreationTimestamp="2026-04-16 18:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
18:58:39.511985978 +0000 UTC m=+2920.215317501" watchObservedRunningTime="2026-04-16 18:58:39.513013192 +0000 UTC m=+2920.216344717" Apr 16 18:58:39.842916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.842823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-f98dfdd99-ll2gn_75b1346f-cd61-4f00-9522-07bd0a521fc6/registry/0.log" Apr 16 18:58:39.915578 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:39.915545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xcxgk_969cd886-bdfc-46ca-ab57-08cca0abed0b/node-ca/0.log" Apr 16 18:58:41.060916 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:41.060882 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-glzcp_33d3f9c8-7acd-4da5-93e7-1274c864ad1c/serve-healthcheck-canary/0.log" Apr 16 18:58:41.645696 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:41.645667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2wwf_c98a1bb4-edd0-485a-b6fc-87204ad0e0dc/kube-rbac-proxy/0.log" Apr 16 18:58:41.677295 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:41.677264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2wwf_c98a1bb4-edd0-485a-b6fc-87204ad0e0dc/exporter/0.log" Apr 16 18:58:41.703899 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:41.703872 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2wwf_c98a1bb4-edd0-485a-b6fc-87204ad0e0dc/extractor/0.log" Apr 16 18:58:43.666917 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:43.666891 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-65589c6846-s4v7h_92b32003-ca4f-49cb-8b14-026672e56812/manager/0.log" Apr 16 18:58:43.688672 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:43.688647 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dvkwt_24e1fec6-80d5-4325-a211-d58a433fde30/manager/0.log" Apr 16 18:58:43.711319 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:43.711294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nrc5q_a238435e-74d0-4e28-be75-30402f705cc8/server/0.log" Apr 16 18:58:44.251847 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:44.251817 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-q4gnz_646b4080-16ab-45ab-be6e-1183ea42a5d3/seaweedfs/0.log" Apr 16 18:58:45.507404 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:45.507377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-b84qs" Apr 16 18:58:50.240391 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.240364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/kube-multus-additional-cni-plugins/0.log" Apr 16 18:58:50.278324 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.278292 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/egress-router-binary-copy/0.log" Apr 16 18:58:50.318211 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.318186 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/cni-plugins/0.log" Apr 16 18:58:50.360108 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.360080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/bond-cni-plugin/0.log" Apr 16 18:58:50.390985 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.390962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/routeoverride-cni/0.log" Apr 16 18:58:50.425139 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.425112 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/whereabouts-cni-bincopy/0.log" Apr 16 18:58:50.460171 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.460139 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fkzwr_8670fc01-fa98-49d3-8a70-5fe409cb46a1/whereabouts-cni/0.log" Apr 16 18:58:50.774015 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.773983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d24gh_07c55bf2-6978-4a9f-ace9-a376fd86df33/kube-multus/0.log" Apr 16 18:58:50.835188 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.835159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-687m2_abb881c2-5bd6-4f73-a490-2665c9449ae7/network-metrics-daemon/0.log" Apr 16 18:58:50.860524 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:50.860493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-687m2_abb881c2-5bd6-4f73-a490-2665c9449ae7/kube-rbac-proxy/0.log" Apr 16 18:58:52.493766 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.493735 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/ovn-controller/0.log" Apr 16 18:58:52.544897 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.544869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/ovn-acl-logging/0.log" Apr 16 18:58:52.565495 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.565463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/kube-rbac-proxy-node/0.log" Apr 16 18:58:52.591316 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.591285 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:58:52.614180 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.614139 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/northd/0.log" Apr 16 18:58:52.636659 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.636636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/nbdb/0.log" Apr 16 18:58:52.660736 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.660700 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/sbdb/0.log" Apr 16 18:58:52.870050 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:52.869967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkbc8_ced49c9c-a339-4f7e-9970-0aabc7b2d765/ovnkube-controller/0.log" Apr 16 18:58:53.893177 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:53.893137 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qhbm6_795df744-5070-4941-a2b6-f01fc85241b9/network-check-target-container/0.log" Apr 16 18:58:54.864517 ip-10-0-143-51 kubenswrapper[2576]: I0416 18:58:54.864490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-snh2z_32938f07-abc4-4f36-9ffc-6472e5b05222/iptables-alerter/0.log"