Apr 16 18:09:59.286892 ip-10-0-128-95 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:09:59.739640 ip-10-0-128-95 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:59.739640 ip-10-0-128-95 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:09:59.739640 ip-10-0-128-95 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:09:59.739640 ip-10-0-128-95 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:09:59.739640 ip-10-0-128-95 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
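The deprecation warnings above all point to the same remedy: set these parameters in the KubeletConfiguration file named by --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump further down). A minimal sketch of the equivalent config file, assuming the standard kubelet.config.k8s.io/v1beta1 schema; the values shown are illustrative, not taken from this node:

```
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path illustrative)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (values illustrative)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```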
Apr 16 18:09:59.742097 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.742009 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:09:59.745819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745805 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:59.745819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745820 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745823 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745827 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745830 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745835 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745839 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745843 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745845 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745848 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745852 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745855 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745858 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745860 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745863 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745865 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745868 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745870 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745873 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745876 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:59.745881 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745879 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745881 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745884 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745887 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745890 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745893 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745895 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745899 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745902 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745904 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745907 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745910 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745912 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745915 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745917 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745919 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745922 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745924 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745927 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745929 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:59.746332 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745931 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745934 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745937 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745939 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745942 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745944 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745947 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745949 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745952 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745954 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745958 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745960 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745963 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745965 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745968 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745971 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745974 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745976 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745979 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745983 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:59.746832 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745986 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745989 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745992 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745995 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.745998 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746001 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746004 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746007 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746010 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746014 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746018 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746022 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746024 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746027 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746029 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746032 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746034 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746037 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746039 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746042 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:59.747325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746044 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746047 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746049 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746052 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746054 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746056 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746423 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746428 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746431 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746449 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746452 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746455 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746458 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746460 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746463 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746466 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746469 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746472 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746474 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746477 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:59.747817 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746479 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746482 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746485 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746488 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746490 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746493 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746495 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746498 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746501 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746504 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746506 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746509 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746511 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746513 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746516 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746518 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746521 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746524 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746528 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746531 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746534 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:59.748299 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746536 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746539 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746541 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746544 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746546 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746549 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746551 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746554 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746556 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746559 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746561 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746564 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746566 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746569 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746571 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746574 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746592 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746595 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746597 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746600 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:59.748830 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746602 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746605 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746607 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746610 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746612 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746616 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746620 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746622 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746625 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746628 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746631 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746633 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746636 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746639 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746641 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746644 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746646 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746649 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746651 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:59.749330 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746654 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746656 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746661 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746665 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746668 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746670 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746673 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746676 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746678 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746681 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746684 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.746687 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747444 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747457 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747465 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747469 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747474 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747477 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747482 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747486 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:09:59.749819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747490 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747493 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747497 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747500 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747503 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747506 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747509 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747512 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747515 2583 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747518 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747521 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747525 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747527 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747531 2583 flags.go:64] FLAG: --config-dir=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747533 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747538 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747542 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747545 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747548 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747551 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747554 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747557 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747560 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747563 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747566 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:09:59.750331 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747570 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747573 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747589 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747592 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747596 2583 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747599 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747605 2583 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747608 2583 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747611 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747615 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747618 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747622 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747625 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747628 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747631 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747634 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747637 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747640 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747643 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747645 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747648 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747651 2583 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747655 2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16
18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747659 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747662 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:09:59.750968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747665 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747668 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747672 2583 flags.go:64] FLAG: --help="false" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747674 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-128-95.ec2.internal" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747677 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747681 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747684 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747687 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747690 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747693 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747696 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747703 2583 flags.go:64] 
FLAG: --kernel-memcg-notification="false" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747706 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747708 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747712 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747714 2583 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747718 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747721 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747724 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747727 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747730 2583 flags.go:64] FLAG: --lock-file="" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747733 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747735 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747739 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:09:59.751622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747744 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747746 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:09:59.747749 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747752 2583 flags.go:64] FLAG: --logging-format="text" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747755 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747759 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747761 2583 flags.go:64] FLAG: --manifest-url="" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747764 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747772 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747775 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747779 2583 flags.go:64] FLAG: --max-pods="110" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747782 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747785 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747788 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747791 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747794 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747797 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 
18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747800 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747808 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747813 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747816 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747819 2583 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747822 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:09:59.752198 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747828 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747831 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747834 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747838 2583 flags.go:64] FLAG: --port="10250" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747841 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747844 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08c786e50f395ee43" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747847 2583 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747850 2583 flags.go:64] FLAG: 
--read-only-port="10255" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747853 2583 flags.go:64] FLAG: --register-node="true" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747856 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747859 2583 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747862 2583 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747865 2583 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747868 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747870 2583 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747874 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747877 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747881 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747883 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747886 2583 flags.go:64] FLAG: --runonce="false" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747889 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747892 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747895 2583 flags.go:64] FLAG: 
--seccomp-default="false" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747898 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747901 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747904 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:09:59.752776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747907 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747910 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747912 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747916 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747919 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747921 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747924 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747927 2583 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747930 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747935 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747938 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:09:59.753513 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:09:59.747940 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747944 2583 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747947 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747950 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747952 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747955 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747958 2583 flags.go:64] FLAG: --v="2" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747962 2583 flags.go:64] FLAG: --version="false" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747966 2583 flags.go:64] FLAG: --vmodule="" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747970 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.747973 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748080 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748084 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:09:59.753513 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748089 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748103 2583 feature_gate.go:328] 
unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748107 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748110 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748112 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748115 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748118 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748121 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748124 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748127 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748129 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748132 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748135 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748138 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:09:59.754112 ip-10-0-128-95 
kubenswrapper[2583]: W0416 18:09:59.748140 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748143 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748146 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748149 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748151 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:09:59.754112 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748154 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748158 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748161 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748164 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748167 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748169 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748172 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748174 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748177 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748179 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748182 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748184 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748187 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748189 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748193 2583 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748195 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748198 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748200 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748203 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748205 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:09:59.754619 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748208 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748211 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748213 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748216 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748220 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748223 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748226 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748229 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748232 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748235 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748237 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748240 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748242 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748245 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748247 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748250 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748252 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748255 2583 feature_gate.go:328] unrecognized feature 
gate: NetworkDiagnosticsConfig Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748257 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748260 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:09:59.755337 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748262 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748265 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748267 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748270 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748275 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748277 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748280 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748283 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748285 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748288 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: 
W0416 18:09:59.748291 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748293 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748296 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748299 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748301 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748304 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748308 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748310 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748313 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748315 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:59.755935 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748318 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748320 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748323 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748325 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.748328 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.748333 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.756181 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.756197 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756245 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756250 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756254 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756257 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756260 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756263 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756265 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:59.756437 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756270 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756275 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756279 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756282 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756285 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756287 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756290 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756293 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756296 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756299 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756302 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756304 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756307 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756310 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756314 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756318 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756320 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756322 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756325 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:59.756863 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756328 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756330 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756333 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756335 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756338 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756341 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756344 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756346 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756349 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756352 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756354 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756357 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756360 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756363 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756365 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756368 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756370 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756373 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756376 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756378 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:59.757327 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756380 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756383 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756386 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756388 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756391 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756393 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756396 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756399 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756401 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756404 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756406 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756409 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756411 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756414 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756417 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756419 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756422 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756424 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756427 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756430 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:59.757841 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756432 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756435 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756438 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756440 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756443 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756446 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756449 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756452 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756454 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756457 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756460 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756462 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756465 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756468 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756470 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756473 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756475 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756478 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756480 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:59.758325 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756483 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.756488 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756599 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756606 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756609 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756612 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756614 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756617 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756620 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756623 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756626 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756628 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756631 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756633 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756636 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:09:59.758827 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756639 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756641 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756644 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756647 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756650 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756652 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756655 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756657 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756660 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756663 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756665 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756668 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756670 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756673 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756675 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756677 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756680 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756682 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756685 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756687 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:09:59.759209 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756690 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756692 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756694 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756697 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756699 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756702 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756704 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756707 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756710 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756712 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756715 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756717 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756721 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756724 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756727 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756730 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756733 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756736 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756739 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:09:59.759731 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756741 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756743 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756746 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756750 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756752 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756754 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756757 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756759 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756762 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756765 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756767 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756770 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756772 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756774 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756777 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756779 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756783 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756786 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756789 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756792 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:09:59.760205 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756794 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756797 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756799 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756802 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756805 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756807 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756810 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756812 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756815 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756818 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756820 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756823 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756825 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:09:59.756828 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.756832 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:09:59.760751 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.757605 2583 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:09:59.761180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.760016 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:09:59.761180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.760974 2583 server.go:1019] "Starting client certificate rotation"
Apr 16 18:09:59.761180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.761082 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:59.761180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.761130 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:09:59.786278 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.786253 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:59.790859 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.790839 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:09:59.808891 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.808869 2583 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:09:59.815148 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.815130 2583 log.go:25] "Validated CRI v1 image API"
Apr 16 18:09:59.816685 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.816667 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:09:59.821736 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.821714 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:09:59.822321 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.822301 2583 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 bc9dbfaa-f5e2-44a9-98f5-f4e406be77d7:/dev/nvme0n1p3 d285b1e6-d526-4c89-8553-f0d3d042e395:/dev/nvme0n1p4]
Apr 16 18:09:59.822373 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.822321 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:09:59.827475 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.827366 2583 manager.go:217] Machine: {Timestamp:2026-04-16 18:09:59.826032638 +0000 UTC m=+0.415230914 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100781 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21f985a286dc45d30695c383812d5e SystemUUID:ec21f985-a286-dc45-d306-95c383812d5e BootID:dacedce0-0f8c-4783-8fcb-6bc18e8c1fb4 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:00:e2:6a:0c:6d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:00:e2:6a:0c:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:43:d1:f8:10:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:09:59.827475 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.827469 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:09:59.827601 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.827556 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:09:59.827938 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.827912 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:09:59.828107 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.827940 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-95.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:09:59.828149 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.828117 2583 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:09:59.828149 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.828126 2583 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:09:59.828149 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.828139 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:59.828228 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.828155 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:09:59.829401 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.829389 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:59.829509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.829501 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:09:59.832027 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.832017 2583 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:09:59.832933 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.832923 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:09:59.833701 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.833691 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:09:59.833732 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.833705 2583 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:09:59.833732 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.833714 2583 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 18:09:59.834958 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.834938 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:59.835317 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.834969 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:09:59.838161 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.838141 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:09:59.839864 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.839850 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:09:59.841214 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841199 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841217 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841223 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841231 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841239 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841247 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841257 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841263 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841271 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841277 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841286 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:09:59.841289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.841295 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:09:59.842286 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.842276 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:09:59.842286 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.842286 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:09:59.843293 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.843264 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:09:59.843989 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.843963 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:09:59.846176 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846159 2583 csi_plugin.go:988] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-95.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:09:59.846274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846261 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:09:59.846307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846298 2583 server.go:1295] "Started kubelet" Apr 16 18:09:59.846540 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846494 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:09:59.846540 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846516 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:09:59.846694 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.846574 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:09:59.847279 ip-10-0-128-95 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:09:59.849236 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.849220 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:09:59.850316 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.850286 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ccjrz" Apr 16 18:09:59.850405 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.850366 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:09:59.851707 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.850687 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-95.ec2.internal.18a6e8bc9dfc0a13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-95.ec2.internal,UID:ip-10-0-128-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-95.ec2.internal,},FirstTimestamp:2026-04-16 18:09:59.846275603 +0000 UTC m=+0.435473879,LastTimestamp:2026-04-16 18:09:59.846275603 +0000 UTC m=+0.435473879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-95.ec2.internal,}" Apr 16 18:09:59.855103 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.855083 2583 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:09:59.856573 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.856555 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:09:59.857189 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.857173 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:09:59.859742 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.859492 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:09:59.859856 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.859845 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:09:59.859946 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.859800 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:09:59.860001 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.859745 2583 factory.go:55] Registering systemd factory Apr 16 18:09:59.860001 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.859970 2583 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:09:59.860076 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.859951 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:09:59.860119 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860088 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ccjrz" Apr 16 18:09:59.860271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860250 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:09:59.860271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860270 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:09:59.860714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860697 2583 
factory.go:153] Registering CRI-O factory Apr 16 18:09:59.860714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860715 2583 factory.go:223] Registration of the crio container factory successfully Apr 16 18:09:59.860857 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860782 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:09:59.860857 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860807 2583 factory.go:103] Registering Raw factory Apr 16 18:09:59.860857 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.860822 2583 manager.go:1196] Started watching for new ooms in manager Apr 16 18:09:59.861211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.861196 2583 manager.go:319] Starting recovery of all containers Apr 16 18:09:59.862234 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.862206 2583 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:09:59.872432 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.872304 2583 manager.go:324] Recovery completed Apr 16 18:09:59.872686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.872665 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:09:59.876463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.876449 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:09:59.878694 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.878680 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" 
event="NodeHasSufficientMemory" Apr 16 18:09:59.878777 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.878711 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:09:59.878777 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.878722 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:09:59.879281 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.879270 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:09:59.879281 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.879279 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:09:59.879376 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.879294 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:09:59.881638 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.881626 2583 policy_none.go:49] "None policy: Start" Apr 16 18:09:59.881689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.881642 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:09:59.881689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.881652 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:09:59.928206 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928191 2583 manager.go:341] "Starting Device Plugin manager" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.928228 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928242 2583 server.go:85] "Starting device plugin registration server" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928493 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:09:59.929208 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:09:59.928506 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928603 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928704 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:09:59.929208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.928715 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:09:59.929540 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.929266 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:09:59.929540 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.929297 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:09:59.985371 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.985328 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:09:59.986731 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.986712 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:09:59.986866 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.986746 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:09:59.986866 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.986778 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:09:59.986866 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.986789 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:09:59.986866 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:09:59.986832 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:09:59.991199 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:09:59.991143 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:00.029158 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.029128 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.030202 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.030184 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.030304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.030215 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.030304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.030225 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.030304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.030248 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.046426 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.046401 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.046525 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.046429 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-95.ec2.internal\": node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:10:00.085649 
ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.085618 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:10:00.087921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.087903 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal"] Apr 16 18:10:00.087980 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.087969 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.089447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.089433 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.089505 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.089464 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.089505 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.089476 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.090705 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.090693 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.090879 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.090865 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.090925 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.090895 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.091425 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091408 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.091484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091435 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.091484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091449 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.091484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091452 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.091484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091467 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.091484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.091478 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.093079 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.093065 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.093125 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.093091 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:10:00.093798 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.093783 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:10:00.093887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.093815 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:10:00.093887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.093828 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:10:00.109187 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.109164 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-95.ec2.internal\" not found" node="ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.113638 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.113622 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-95.ec2.internal\" not found" node="ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.161375 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.161340 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.161375 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:10:00.161376 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.161541 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.161395 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c631bb54e7f931dcf513f44e89f6bf7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-95.ec2.internal\" (UID: \"1c631bb54e7f931dcf513f44e89f6bf7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.186275 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.186249 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:10:00.262259 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262173 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.262259 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262211 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 
18:10:00.262259 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262235 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c631bb54e7f931dcf513f44e89f6bf7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-95.ec2.internal\" (UID: \"1c631bb54e7f931dcf513f44e89f6bf7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.262468 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262273 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.262468 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262287 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1c631bb54e7f931dcf513f44e89f6bf7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-95.ec2.internal\" (UID: \"1c631bb54e7f931dcf513f44e89f6bf7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.262468 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.262291 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/927f711b1ee698a25d6a2103f40d71ed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal\" (UID: \"927f711b1ee698a25d6a2103f40d71ed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" Apr 16 18:10:00.287302 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.287276 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found" Apr 16 18:10:00.388091 ip-10-0-128-95 
kubenswrapper[2583]: E0416 18:10:00.388051 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.412184 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.412164 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal"
Apr 16 18:10:00.416719 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.416699 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal"
Apr 16 18:10:00.488944 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.488893 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.589596 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.589484 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.690126 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.690091 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.760478 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.760445 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:10:00.761044 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.760659 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:00.761044 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.760668 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:10:00.790873 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.790837 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.856778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.856705 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:10:00.864336 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.864288 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:04:59 +0000 UTC" deadline="2027-10-21 14:55:19.602218206 +0000 UTC"
Apr 16 18:10:00.864336 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.864334 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13268h45m18.73788738s"
Apr 16 18:10:00.865916 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.865898 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:10:00.891899 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.891870 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.912087 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.912066 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-58pcz"
Apr 16 18:10:00.919119 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.919097 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-58pcz"
Apr 16 18:10:00.943048 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:00.943017 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:00.992128 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:00.992104 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:00.995857 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:00.995828 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c631bb54e7f931dcf513f44e89f6bf7.slice/crio-0f7fca0d4b61eb662fa099e3195494f65f354887e88826ee5b0c0a1934341018 WatchSource:0}: Error finding container 0f7fca0d4b61eb662fa099e3195494f65f354887e88826ee5b0c0a1934341018: Status 404 returned error can't find the container with id 0f7fca0d4b61eb662fa099e3195494f65f354887e88826ee5b0c0a1934341018
Apr 16 18:10:00.996260 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:00.996244 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod927f711b1ee698a25d6a2103f40d71ed.slice/crio-ad35bcafdf7abd73a437d101614c530d5bb5598b313bcdf2a20134ab5d552489 WatchSource:0}: Error finding container ad35bcafdf7abd73a437d101614c530d5bb5598b313bcdf2a20134ab5d552489: Status 404 returned error can't find the container with id ad35bcafdf7abd73a437d101614c530d5bb5598b313bcdf2a20134ab5d552489
Apr 16 18:10:01.001064 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.001050 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:10:01.092228 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:01.092175 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-95.ec2.internal\" not found"
Apr 16 18:10:01.179935 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.179917 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:10:01.257633 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.257552 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal"
Apr 16 18:10:01.266688 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.266668 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:01.267670 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.267657 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal"
Apr 16 18:10:01.276490 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.276470 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:10:01.834753 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.834719 2583 apiserver.go:52] "Watching apiserver"
Apr 16 18:10:01.840046 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.839994 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:10:01.840714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.840691 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k7ztx","kube-system/konnectivity-agent-vhpgc","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h","openshift-cluster-node-tuning-operator/tuned-7jqb4","openshift-dns/node-resolver-hhl4h","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal","openshift-multus/multus-additional-cni-plugins-cscsn","openshift-multus/multus-qtmlc","openshift-network-operator/iptables-alerter-4wkgk","kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal","openshift-image-registry/node-ca-5tctm","openshift-multus/network-metrics-daemon-chzqx","openshift-network-diagnostics/network-check-target-9rmkx"]
Apr 16 18:10:01.843861 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.843835 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.845009 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.844988 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vhpgc"
Apr 16 18:10:01.846302 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.846281 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.846553 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.846535 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:10:01.846780 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.846752 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.846906 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.846891 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:10:01.847057 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847046 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.847129 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847112 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vwqxk\""
Apr 16 18:10:01.847385 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847334 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:10:01.847741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847551 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dtdqr\""
Apr 16 18:10:01.847741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847621 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:10:01.847741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.847620 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:10:01.848232 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.848213 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.849493 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.848810 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.849493 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.849324 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g6dg6\""
Apr 16 18:10:01.849493 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.849328 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.850718 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.850696 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.851059 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.851021 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-462fd\""
Apr 16 18:10:01.851829 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.851811 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.856792 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.854914 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hhl4h"
Apr 16 18:10:01.856792 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.856493 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:10:01.856792 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.856604 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.857115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.856915 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fs5s4\""
Apr 16 18:10:01.857115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857008 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.857115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857041 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.857247 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857129 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:10:01.857247 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857216 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:10:01.857247 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857235 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:10:01.857339 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857218 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.857493 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857475 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:10:01.857822 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857804 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.857918 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.857826 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j5k87\""
Apr 16 18:10:01.858269 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.858095 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.859824 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.859804 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:10:01.860032 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.860017 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x4h8x\""
Apr 16 18:10:01.862077 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.862055 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4wkgk"
Apr 16 18:10:01.863778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.863759 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rqkcd\""
Apr 16 18:10:01.863960 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.863940 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:10:01.864154 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.864135 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.864381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.864023 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5tctm"
Apr 16 18:10:01.864537 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.864521 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.865917 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.865889 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:01.866021 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:01.865979 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:01.866290 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.866270 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:10:01.866460 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.866447 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:10:01.866714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.866573 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2b5w5\""
Apr 16 18:10:01.866938 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.866504 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:10:01.868925 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.868897 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:01.869015 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:01.868973 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870798 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-multus\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xgw\" (UniqueName: \"kubernetes.io/projected/c3c85f82-6657-4a18-9364-c3ce61213e8a-kube-api-access-79xgw\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870865 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-socket-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870891 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-os-release\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870916 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-registration-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870971 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c28c1588-a1f5-4491-bbfc-135c1d264663-tmp-dir\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.870999 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovn-node-metrics-cert\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871038 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-bin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871063 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871085 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-sys\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871116 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-systemd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871145 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-bin\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871158 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b4b5a677-a74e-41a0-9821-6b56e1c0328c-agent-certs\") pod \"konnectivity-agent-vhpgc\" (UID: \"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc"
Apr 16 18:10:01.871603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871180 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-run\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871194 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-tuned\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871211 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-var-lib-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871225 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871239 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-multus-certs\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871257 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhb9t\" (UniqueName: \"kubernetes.io/projected/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kube-api-access-bhb9t\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871275 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c28c1588-a1f5-4491-bbfc-135c1d264663-hosts-file\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871290 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-iptables-alerter-script\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871304 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-slash\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871318 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87jt\" (UniqueName: \"kubernetes.io/projected/e9da1b91-a9ae-4adf-ac9f-881e7217faad-kube-api-access-v87jt\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871333 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-k8s-cni-cncf-io\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871359 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-kubelet\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871372 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-hostroot\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871397 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ckw\" (UniqueName: \"kubernetes.io/projected/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-kube-api-access-g9ckw\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871415 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-kubelet\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871430 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.872387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871445 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871459 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-device-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871492 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-conf\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871509 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-etc-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-ovn\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871540 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-config\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-netns\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871569 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rghkp\" (UniqueName: \"kubernetes.io/projected/a5fef405-bf92-4991-86fe-f71befb39d59-kube-api-access-rghkp\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871608 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-cnibin\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871633 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-sys-fs\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871665 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-cni-binary-copy\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871697 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysconfig\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871719 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-lib-modules\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871745 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-netd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871769 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-conf-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871801 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-multus-daemon-config\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871825 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-systemd\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.873128 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871849 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-var-lib-kubelet\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4"
Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871877 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-systemd-units\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871901 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-netns\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx"
Apr 16
18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871925 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-node-log\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-env-overrides\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.871976 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-tmp\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872001 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbrn\" (UniqueName: \"kubernetes.io/projected/c28c1588-a1f5-4491-bbfc-135c1d264663-kube-api-access-mlbrn\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872026 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b4b5a677-a74e-41a0-9821-6b56e1c0328c-konnectivity-ca\") pod \"konnectivity-agent-vhpgc\" (UID: 
\"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872051 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-modprobe-d\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872106 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7qp2\" (UniqueName: \"kubernetes.io/projected/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-kube-api-access-c7qp2\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872134 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872161 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-system-cni-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872187 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872212 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872241 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-host-slash\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872265 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-kubernetes\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.873920 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872291 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-log-socket\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.874730 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:10:01.872318 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-os-release\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872343 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-etc-selinux\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872375 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-system-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872405 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-socket-dir-parent\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872431 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-etc-kubernetes\") pod \"multus-qtmlc\" (UID: 
\"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872456 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-d\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872483 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-host\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872508 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-script-lib\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.874730 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.872532 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-cnibin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.920327 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.920228 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:00 +0000 UTC" deadline="2027-10-03 11:35:25.12434114 +0000 
UTC" Apr 16 18:10:01.920327 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.920260 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12833h25m23.204084646s" Apr 16 18:10:01.961689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.961661 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:10:01.973703 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973670 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9sb\" (UniqueName: \"kubernetes.io/projected/4eacb341-6891-41dc-a3c0-09b5697178ee-kube-api-access-4n9sb\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:01.973893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973763 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-registration-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.973893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973790 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c28c1588-a1f5-4491-bbfc-135c1d264663-tmp-dir\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.973893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973823 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovn-node-metrics-cert\") pod \"ovnkube-node-k7ztx\" (UID: 
\"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.973893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973864 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-bin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.973893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973887 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973902 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973924 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-sys\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973946 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-systemd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973964 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.973982 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974003 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-bin\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974017 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b4b5a677-a74e-41a0-9821-6b56e1c0328c-agent-certs\") pod \"konnectivity-agent-vhpgc\" (UID: \"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974035 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-run\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974051 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-tuned\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974066 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-var-lib-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974087 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974127 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-multus-certs\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974146 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhb9t\" (UniqueName: \"kubernetes.io/projected/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kube-api-access-bhb9t\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974176 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c28c1588-a1f5-4491-bbfc-135c1d264663-hosts-file\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-iptables-alerter-script\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974208 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-slash\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v87jt\" (UniqueName: \"kubernetes.io/projected/e9da1b91-a9ae-4adf-ac9f-881e7217faad-kube-api-access-v87jt\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 
18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974248 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-k8s-cni-cncf-io\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-kubelet\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-hostroot\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ckw\" (UniqueName: \"kubernetes.io/projected/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-kube-api-access-g9ckw\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974314 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-kubelet\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 
18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f7de0432-d90d-4397-aa83-1a431f35bfa6-serviceca\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974381 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-device-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-conf\") pod \"tuned-7jqb4\" (UID: 
\"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-etc-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.974448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-ovn\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974453 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-config\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974476 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-netns\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rghkp\" (UniqueName: \"kubernetes.io/projected/a5fef405-bf92-4991-86fe-f71befb39d59-kube-api-access-rghkp\") pod \"multus-qtmlc\" (UID: 
\"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974517 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-cnibin\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974531 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-sys-fs\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974605 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-cni-binary-copy\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974647 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysconfig\") pod 
\"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974667 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-lib-modules\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974683 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-netd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-conf-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974724 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-multus-daemon-config\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974738 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-systemd\") pod \"tuned-7jqb4\" (UID: 
\"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-var-lib-kubelet\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974788 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-systemd-units\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974841 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-netns\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974860 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-node-log\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974885 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-env-overrides\") pod \"ovnkube-node-k7ztx\" 
(UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sml7s\" (UniqueName: \"kubernetes.io/projected/f7de0432-d90d-4397-aa83-1a431f35bfa6-kube-api-access-sml7s\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974936 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-tmp\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbrn\" (UniqueName: \"kubernetes.io/projected/c28c1588-a1f5-4491-bbfc-135c1d264663-kube-api-access-mlbrn\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974980 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b4b5a677-a74e-41a0-9821-6b56e1c0328c-konnectivity-ca\") pod \"konnectivity-agent-vhpgc\" (UID: \"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.974997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-modprobe-d\") pod 
\"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975012 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7qp2\" (UniqueName: \"kubernetes.io/projected/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-kube-api-access-c7qp2\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975035 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975060 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-system-cni-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975093 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975155 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-host-slash\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975174 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-kubernetes\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975191 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-log-socket\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975216 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-os-release\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975239 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-etc-selinux\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.975713 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-system-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-socket-dir-parent\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975311 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-etc-kubernetes\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975335 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-d\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975365 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-host\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-script-lib\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975406 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-cnibin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975426 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-multus\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975452 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79xgw\" (UniqueName: \"kubernetes.io/projected/c3c85f82-6657-4a18-9364-c3ce61213e8a-kube-api-access-79xgw\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975473 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-socket-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975494 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-os-release\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975513 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7de0432-d90d-4397-aa83-1a431f35bfa6-host\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975662 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-system-cni-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975679 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-sys-fs\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975718 2583 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysconfig\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975934 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-lib-modules\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.975968 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-netd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.976533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976017 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-conf-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976334 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-cni-binary-copy\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976344 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976431 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-etc-kubernetes\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976434 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976468 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-host-slash\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976477 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-registration-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976504 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-kubernetes\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976528 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-log-socket\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976551 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-d\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976603 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-os-release\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976647 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/a5fef405-bf92-4991-86fe-f71befb39d59-multus-daemon-config\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-etc-selinux\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976625 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-host\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976697 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-system-cni-dir\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976728 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-multus-socket-dir-parent\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976731 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-systemd\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976788 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-bin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-var-lib-kubelet\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976823 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c28c1588-a1f5-4491-bbfc-135c1d264663-tmp-dir\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.976882 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-slash\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977162 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-cni-binary-copy\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977255 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-iptables-alerter-script\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977298 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-k8s-cni-cncf-io\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977358 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-kubelet\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977400 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-hostroot\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977400 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-script-lib\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977457 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c3c85f82-6657-4a18-9364-c3ce61213e8a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977477 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-cnibin\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977531 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-var-lib-cni-multus\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977595 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977603 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-systemd\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.977921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977628 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:10:01.978489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978166 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-os-release\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.978489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977541 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-sys\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.978489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978245 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.978489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978414 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-sysctl-conf\") pod \"tuned-7jqb4\" (UID: 
\"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.978489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978486 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-systemd-units\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.978945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.979042 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979006 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-socket-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.979158 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.978668 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-kubelet\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.979353 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979322 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3c85f82-6657-4a18-9364-c3ce61213e8a-cnibin\") pod 
\"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.979474 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979389 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-run-netns\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.979474 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979434 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-device-dir\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:01.979773 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979748 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-multus-certs\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.979975 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.979950 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-run\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.980599 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.980441 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovnkube-config\") pod 
\"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.981180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.981143 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b4b5a677-a74e-41a0-9821-6b56e1c0328c-konnectivity-ca\") pod \"konnectivity-agent-vhpgc\" (UID: \"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:01.982721 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.982691 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-etc-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.982831 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.982748 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-ovn\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.982909 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.982894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5fef405-bf92-4991-86fe-f71befb39d59-host-run-netns\") pod \"multus-qtmlc\" (UID: \"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.982964 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.982892 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-tuned\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.983451 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983425 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e9da1b91-a9ae-4adf-ac9f-881e7217faad-env-overrides\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.983534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983381 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c28c1588-a1f5-4491-bbfc-135c1d264663-hosts-file\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.983534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983494 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-var-lib-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.983534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983518 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b4b5a677-a74e-41a0-9821-6b56e1c0328c-agent-certs\") pod \"konnectivity-agent-vhpgc\" (UID: \"b4b5a677-a74e-41a0-9821-6b56e1c0328c\") " pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:01.983689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983550 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-run-openvswitch\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.983689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983560 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-node-log\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.983776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.983725 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-etc-modprobe-d\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.983844 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.977637 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e9da1b91-a9ae-4adf-ac9f-881e7217faad-host-cni-bin\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.986546 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.986512 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ckw\" (UniqueName: \"kubernetes.io/projected/fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916-kube-api-access-g9ckw\") pod \"iptables-alerter-4wkgk\" (UID: \"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916\") " pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:01.986705 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.986682 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-tmp\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.987180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.987147 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87jt\" (UniqueName: \"kubernetes.io/projected/e9da1b91-a9ae-4adf-ac9f-881e7217faad-kube-api-access-v87jt\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.989481 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.989090 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b91-a9ae-4adf-ac9f-881e7217faad-ovn-node-metrics-cert\") pod \"ovnkube-node-k7ztx\" (UID: \"e9da1b91-a9ae-4adf-ac9f-881e7217faad\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:01.992684 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.992550 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xgw\" (UniqueName: \"kubernetes.io/projected/c3c85f82-6657-4a18-9364-c3ce61213e8a-kube-api-access-79xgw\") pod \"multus-additional-cni-plugins-cscsn\" (UID: \"c3c85f82-6657-4a18-9364-c3ce61213e8a\") " pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:01.994424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.994166 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" event={"ID":"927f711b1ee698a25d6a2103f40d71ed","Type":"ContainerStarted","Data":"ad35bcafdf7abd73a437d101614c530d5bb5598b313bcdf2a20134ab5d552489"} Apr 16 18:10:01.994424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.994369 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rghkp\" (UniqueName: \"kubernetes.io/projected/a5fef405-bf92-4991-86fe-f71befb39d59-kube-api-access-rghkp\") pod \"multus-qtmlc\" (UID: 
\"a5fef405-bf92-4991-86fe-f71befb39d59\") " pod="openshift-multus/multus-qtmlc" Apr 16 18:10:01.994562 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.994508 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7qp2\" (UniqueName: \"kubernetes.io/projected/0327926e-d4d2-4b7a-8e66-f69c6adf00f4-kube-api-access-c7qp2\") pod \"tuned-7jqb4\" (UID: \"0327926e-d4d2-4b7a-8e66-f69c6adf00f4\") " pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:01.996863 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.996823 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" event={"ID":"1c631bb54e7f931dcf513f44e89f6bf7","Type":"ContainerStarted","Data":"0f7fca0d4b61eb662fa099e3195494f65f354887e88826ee5b0c0a1934341018"} Apr 16 18:10:01.997395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.997371 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbrn\" (UniqueName: \"kubernetes.io/projected/c28c1588-a1f5-4491-bbfc-135c1d264663-kube-api-access-mlbrn\") pod \"node-resolver-hhl4h\" (UID: \"c28c1588-a1f5-4491-bbfc-135c1d264663\") " pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:01.997544 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:01.997433 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhb9t\" (UniqueName: \"kubernetes.io/projected/a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2-kube-api-access-bhb9t\") pod \"aws-ebs-csi-driver-node-hz66h\" (UID: \"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:02.076695 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076660 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7de0432-d90d-4397-aa83-1a431f35bfa6-host\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " 
pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.076695 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9sb\" (UniqueName: \"kubernetes.io/projected/4eacb341-6891-41dc-a3c0-09b5697178ee-kube-api-access-4n9sb\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:02.076945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076721 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:02.076945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076752 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f7de0432-d90d-4397-aa83-1a431f35bfa6-serviceca\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.076945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076785 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:02.076945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076798 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f7de0432-d90d-4397-aa83-1a431f35bfa6-host\") pod \"node-ca-5tctm\" (UID: 
\"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.076945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.076816 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sml7s\" (UniqueName: \"kubernetes.io/projected/f7de0432-d90d-4397-aa83-1a431f35bfa6-kube-api-access-sml7s\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.077175 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.077058 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.077175 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.077137 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:02.577108555 +0000 UTC m=+3.166306838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.077265 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.077230 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f7de0432-d90d-4397-aa83-1a431f35bfa6-serviceca\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.085026 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.084950 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:02.085026 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.084978 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:02.085026 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.084993 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.085278 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.085063 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:02.585041023 +0000 UTC m=+3.174239306 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.088301 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.088269 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9sb\" (UniqueName: \"kubernetes.io/projected/4eacb341-6891-41dc-a3c0-09b5697178ee-kube-api-access-4n9sb\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:02.088424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.088316 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sml7s\" (UniqueName: \"kubernetes.io/projected/f7de0432-d90d-4397-aa83-1a431f35bfa6-kube-api-access-sml7s\") pod \"node-ca-5tctm\" (UID: \"f7de0432-d90d-4397-aa83-1a431f35bfa6\") " pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.162567 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.162526 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cscsn" Apr 16 18:10:02.168110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.168087 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:02.175695 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.175675 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:02.185514 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.185485 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" Apr 16 18:10:02.200184 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.200161 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" Apr 16 18:10:02.209876 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.209857 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:02.219478 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.219451 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hhl4h" Apr 16 18:10:02.229246 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.229221 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qtmlc" Apr 16 18:10:02.238911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.238888 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4wkgk" Apr 16 18:10:02.246440 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.246418 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5tctm" Apr 16 18:10:02.344058 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.343969 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:10:02.580927 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.580892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:02.581102 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.581006 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.581102 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.581069 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:03.581048544 +0000 UTC m=+4.170246808 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:02.681224 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.681185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:02.681416 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.681369 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:02.681416 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.681394 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:02.681416 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.681407 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.681546 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:02.681465 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:03.681449533 +0000 UTC m=+4.270647798 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:02.859479 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.859436 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9da1b91_a9ae_4adf_ac9f_881e7217faad.slice/crio-fc58aab721386036ecb01b700a8f56d49bb2d64d2ddc03c17a70e446e3f94c4b WatchSource:0}: Error finding container fc58aab721386036ecb01b700a8f56d49bb2d64d2ddc03c17a70e446e3f94c4b: Status 404 returned error can't find the container with id fc58aab721386036ecb01b700a8f56d49bb2d64d2ddc03c17a70e446e3f94c4b Apr 16 18:10:02.861922 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.861895 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c85f82_6657_4a18_9364_c3ce61213e8a.slice/crio-19812810800b5ba1b3ed928f2ed671174d4d22b2876f27c3f84e775af43a237a WatchSource:0}: Error finding container 19812810800b5ba1b3ed928f2ed671174d4d22b2876f27c3f84e775af43a237a: Status 404 returned error can't find the container with id 19812810800b5ba1b3ed928f2ed671174d4d22b2876f27c3f84e775af43a237a Apr 16 18:10:02.864063 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.864043 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7de0432_d90d_4397_aa83_1a431f35bfa6.slice/crio-56834fc0ed4c662230e8048283c72b7949731c0e29806d565782613246d391c6 WatchSource:0}: Error finding container 
56834fc0ed4c662230e8048283c72b7949731c0e29806d565782613246d391c6: Status 404 returned error can't find the container with id 56834fc0ed4c662230e8048283c72b7949731c0e29806d565782613246d391c6 Apr 16 18:10:02.865172 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.865156 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7a4ef0_d501_4e24_b5e2_d35a8b4c3916.slice/crio-0a1de785a7aefd090f482904f884026d53b76258654c36f361d3f67501e58b5a WatchSource:0}: Error finding container 0a1de785a7aefd090f482904f884026d53b76258654c36f361d3f67501e58b5a: Status 404 returned error can't find the container with id 0a1de785a7aefd090f482904f884026d53b76258654c36f361d3f67501e58b5a Apr 16 18:10:02.866666 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.866429 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b5a677_a74e_41a0_9821_6b56e1c0328c.slice/crio-9f597cb463961deb838c5b65c7b9f79cc3f95d26ea2d3b94bd9e24b3eb8c751e WatchSource:0}: Error finding container 9f597cb463961deb838c5b65c7b9f79cc3f95d26ea2d3b94bd9e24b3eb8c751e: Status 404 returned error can't find the container with id 9f597cb463961deb838c5b65c7b9f79cc3f95d26ea2d3b94bd9e24b3eb8c751e Apr 16 18:10:02.867506 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.867483 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fef405_bf92_4991_86fe_f71befb39d59.slice/crio-f5e40a81fe30f427c18f8d8952799b72f97f70965e4e2b995736e77f62f785a6 WatchSource:0}: Error finding container f5e40a81fe30f427c18f8d8952799b72f97f70965e4e2b995736e77f62f785a6: Status 404 returned error can't find the container with id f5e40a81fe30f427c18f8d8952799b72f97f70965e4e2b995736e77f62f785a6 Apr 16 18:10:02.868641 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.868495 2583 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28c1588_a1f5_4491_bbfc_135c1d264663.slice/crio-bfa474f2a44bf8e8c7dae1bb6a91c72b91e36ceeef6302ebc653f22f8d772d5e WatchSource:0}: Error finding container bfa474f2a44bf8e8c7dae1bb6a91c72b91e36ceeef6302ebc653f22f8d772d5e: Status 404 returned error can't find the container with id bfa474f2a44bf8e8c7dae1bb6a91c72b91e36ceeef6302ebc653f22f8d772d5e Apr 16 18:10:02.869819 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.869718 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0327926e_d4d2_4b7a_8e66_f69c6adf00f4.slice/crio-3688fa1b057e54b9b07df8f38be1c85622c97547bf93bad44231eb5fea3fea69 WatchSource:0}: Error finding container 3688fa1b057e54b9b07df8f38be1c85622c97547bf93bad44231eb5fea3fea69: Status 404 returned error can't find the container with id 3688fa1b057e54b9b07df8f38be1c85622c97547bf93bad44231eb5fea3fea69 Apr 16 18:10:02.871815 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:02.871421 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c77a14_b26d_4fd9_91aa_dbc7c0b543f2.slice/crio-8bba323befa0a6e637ed048df791498da6990972452292e09aa87131f9d4c340 WatchSource:0}: Error finding container 8bba323befa0a6e637ed048df791498da6990972452292e09aa87131f9d4c340: Status 404 returned error can't find the container with id 8bba323befa0a6e637ed048df791498da6990972452292e09aa87131f9d4c340 Apr 16 18:10:02.920444 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.920406 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:05:00 +0000 UTC" deadline="2027-10-05 11:40:22.023932929 +0000 UTC" Apr 16 18:10:02.920444 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.920440 2583 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12881h30m19.103496002s" Apr 16 18:10:02.999964 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:02.999929 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qtmlc" event={"ID":"a5fef405-bf92-4991-86fe-f71befb39d59","Type":"ContainerStarted","Data":"f5e40a81fe30f427c18f8d8952799b72f97f70965e4e2b995736e77f62f785a6"} Apr 16 18:10:03.000895 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.000868 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vhpgc" event={"ID":"b4b5a677-a74e-41a0-9821-6b56e1c0328c","Type":"ContainerStarted","Data":"9f597cb463961deb838c5b65c7b9f79cc3f95d26ea2d3b94bd9e24b3eb8c751e"} Apr 16 18:10:03.001884 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.001860 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerStarted","Data":"19812810800b5ba1b3ed928f2ed671174d4d22b2876f27c3f84e775af43a237a"} Apr 16 18:10:03.003381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.003360 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" event={"ID":"1c631bb54e7f931dcf513f44e89f6bf7","Type":"ContainerStarted","Data":"cb744f93136c6ecbb4f2e02ed895bc0b1ecb0d9a7480bd0514227252446dbbb3"} Apr 16 18:10:03.004417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.004398 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4wkgk" event={"ID":"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916","Type":"ContainerStarted","Data":"0a1de785a7aefd090f482904f884026d53b76258654c36f361d3f67501e58b5a"} Apr 16 18:10:03.005355 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.005337 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hhl4h" 
event={"ID":"c28c1588-a1f5-4491-bbfc-135c1d264663","Type":"ContainerStarted","Data":"bfa474f2a44bf8e8c7dae1bb6a91c72b91e36ceeef6302ebc653f22f8d772d5e"} Apr 16 18:10:03.006276 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.006260 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5tctm" event={"ID":"f7de0432-d90d-4397-aa83-1a431f35bfa6","Type":"ContainerStarted","Data":"56834fc0ed4c662230e8048283c72b7949731c0e29806d565782613246d391c6"} Apr 16 18:10:03.007289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.007269 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"fc58aab721386036ecb01b700a8f56d49bb2d64d2ddc03c17a70e446e3f94c4b"} Apr 16 18:10:03.008305 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.008267 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" event={"ID":"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2","Type":"ContainerStarted","Data":"8bba323befa0a6e637ed048df791498da6990972452292e09aa87131f9d4c340"} Apr 16 18:10:03.009310 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.009293 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" event={"ID":"0327926e-d4d2-4b7a-8e66-f69c6adf00f4","Type":"ContainerStarted","Data":"3688fa1b057e54b9b07df8f38be1c85622c97547bf93bad44231eb5fea3fea69"} Apr 16 18:10:03.017550 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.017515 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-95.ec2.internal" podStartSLOduration=2.01750506 podStartE2EDuration="2.01750506s" podCreationTimestamp="2026-04-16 18:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
18:10:03.017409991 +0000 UTC m=+3.606608267" watchObservedRunningTime="2026-04-16 18:10:03.01750506 +0000 UTC m=+3.606703346" Apr 16 18:10:03.588797 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.588758 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:03.589015 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.588920 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:03.589015 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.588999 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:05.588978644 +0000 UTC m=+6.178176911 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:03.690348 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.689695 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:03.690348 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.689879 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:03.690348 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.689899 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:03.690348 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.689913 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:03.690348 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.689979 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:05.689960852 +0000 UTC m=+6.279159123 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:03.993769 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.993739 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:03.994211 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.993889 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:03.994211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:03.993739 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:03.996391 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:03.994356 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:04.021496 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:04.021457 2583 generic.go:358] "Generic (PLEG): container finished" podID="927f711b1ee698a25d6a2103f40d71ed" containerID="e324c7fb7af74b735f3b5a4ebcfaa03ffcd7c214ecd34a4179316a95e25de338" exitCode=0 Apr 16 18:10:04.022646 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:04.022396 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" event={"ID":"927f711b1ee698a25d6a2103f40d71ed","Type":"ContainerDied","Data":"e324c7fb7af74b735f3b5a4ebcfaa03ffcd7c214ecd34a4179316a95e25de338"} Apr 16 18:10:05.042535 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:05.042493 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" event={"ID":"927f711b1ee698a25d6a2103f40d71ed","Type":"ContainerStarted","Data":"a66b6349a3101980abbedd417f7de7c55ec4f5b66eb70d890a688cd4b0c9665c"} Apr 16 18:10:05.607131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:05.607091 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:05.607331 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.607295 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:05.607401 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.607362 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs 
podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.607341412 +0000 UTC m=+10.196539696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:05.708209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:05.708171 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:05.708408 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.708390 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:05.708486 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.708416 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:05.708486 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.708429 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:05.708620 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.708490 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. No retries permitted until 2026-04-16 18:10:09.708471548 +0000 UTC m=+10.297669824 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:05.988613 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:05.987595 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:05.988613 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.987732 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:05.988613 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:05.988191 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:05.988613 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:05.988276 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:07.989942 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:07.989909 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:07.990376 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:07.990040 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:07.990439 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:07.990418 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:07.990540 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:07.990512 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:09.641143 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:09.641103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:09.641763 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.641279 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:09.641763 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.641329 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:17.641316464 +0000 UTC m=+18.230514732 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:09.742662 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:09.742070 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:09.742662 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.742206 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:09.742662 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.742227 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:09.742662 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.742241 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:09.742662 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.742301 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:17.742282182 +0000 UTC m=+18.331480464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:09.990113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:09.989564 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:09.990113 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.989710 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:09.990782 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:09.990749 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:09.990910 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:09.990848 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:11.989609 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:11.987274 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:11.989609 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:11.987408 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:11.989609 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:11.987503 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:11.989609 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:11.987571 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:13.987800 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:13.987760 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:13.988326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:13.987808 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:13.988326 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:13.987914 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:13.988326 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:13.988063 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:15.990450 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:15.990424 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:15.990450 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:15.990436 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:15.990952 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:15.990537 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:15.990952 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:15.990700 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:17.701834 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:17.701799 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:17.702351 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.701926 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:17.702351 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.701985 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:10:33.7019685 +0000 UTC m=+34.291166767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:17.802320 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:17.802284 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:17.802485 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.802458 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:17.802485 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.802481 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:17.802570 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.802495 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.802570 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.802553 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:10:33.802534855 +0000 UTC m=+34.391733121 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:17.987556 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:17.987472 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:17.987717 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:17.987472 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:17.987717 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.987636 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:17.987717 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:17.987691 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:19.987809 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:19.987781 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:19.988206 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:19.987885 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:19.988206 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:19.987920 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:19.988206 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:19.988011 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:21.072879 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.072632 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vhpgc" event={"ID":"b4b5a677-a74e-41a0-9821-6b56e1c0328c","Type":"ContainerStarted","Data":"345f5ec386650f9d2f854ee95f66fe73910875f1be92fb7ea623d59f6a72161e"} Apr 16 18:10:21.074001 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.073975 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="d9090e8919b7760e72be0a20aa66ce6d9d9a3b576aec9e333321d98b17bd07bb" exitCode=0 Apr 16 18:10:21.074147 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.074041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"d9090e8919b7760e72be0a20aa66ce6d9d9a3b576aec9e333321d98b17bd07bb"} Apr 16 18:10:21.075536 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.075457 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hhl4h" event={"ID":"c28c1588-a1f5-4491-bbfc-135c1d264663","Type":"ContainerStarted","Data":"da5dc8325185417daff494ff9485c35faa66950d822de84fe1c5fde20742f7db"} Apr 16 18:10:21.076952 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.076921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5tctm" event={"ID":"f7de0432-d90d-4397-aa83-1a431f35bfa6","Type":"ContainerStarted","Data":"61bd58a3c8dbffe64331cfd248f90274fc69da98e986764e235abef0f3352ee0"} Apr 16 18:10:21.078745 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.078727 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:10:21.079035 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:10:21.079015 2583 generic.go:358] "Generic (PLEG): container finished" podID="e9da1b91-a9ae-4adf-ac9f-881e7217faad" containerID="1df041808b76c5648fa93dab2851607623491dfadcf79480a16898a3033ad516" exitCode=1 Apr 16 18:10:21.079091 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.079077 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"37bea0d5be41ec99efa04b8610dcf590d98e99532814ca2539144be5a0dad760"} Apr 16 18:10:21.079131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.079096 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"a8b58d316c60d164800195a4c5a9ea30e277509791b4ed9f308a6721bcc13c38"} Apr 16 18:10:21.079131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.079106 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerDied","Data":"1df041808b76c5648fa93dab2851607623491dfadcf79480a16898a3033ad516"} Apr 16 18:10:21.079131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.079115 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"b34c08ec2cbc165cdd2830f6ee40ad931b27313b3287da9218c65fdd5db875b1"} Apr 16 18:10:21.080304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.080282 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" event={"ID":"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2","Type":"ContainerStarted","Data":"2c9989fbd4d3b01d9d9ba123c26fe38ea5352a5681a5fc7d208ed13e1a6f1dc4"} Apr 16 18:10:21.081542 ip-10-0-128-95 kubenswrapper[2583]: 
I0416 18:10:21.081518 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" event={"ID":"0327926e-d4d2-4b7a-8e66-f69c6adf00f4","Type":"ContainerStarted","Data":"62c9aab24283d969018b9c23bc969ecfad7f7762cd2921d1b7edc6b3aad5ef31"} Apr 16 18:10:21.082779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.082759 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qtmlc" event={"ID":"a5fef405-bf92-4991-86fe-f71befb39d59","Type":"ContainerStarted","Data":"82761ac494d25a8167c7755a043f6a1388ab72b593566844738aba869f6f2af4"} Apr 16 18:10:21.085570 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.085535 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vhpgc" podStartSLOduration=3.990374791 podStartE2EDuration="21.085525778s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.868412989 +0000 UTC m=+3.457611253" lastFinishedPulling="2026-04-16 18:10:19.963563976 +0000 UTC m=+20.552762240" observedRunningTime="2026-04-16 18:10:21.085264185 +0000 UTC m=+21.674462471" watchObservedRunningTime="2026-04-16 18:10:21.085525778 +0000 UTC m=+21.674724065" Apr 16 18:10:21.085794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.085773 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-95.ec2.internal" podStartSLOduration=20.085766111 podStartE2EDuration="20.085766111s" podCreationTimestamp="2026-04-16 18:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:05.057731878 +0000 UTC m=+5.646930165" watchObservedRunningTime="2026-04-16 18:10:21.085766111 +0000 UTC m=+21.674964464" Apr 16 18:10:21.101395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.101351 2583 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7jqb4" podStartSLOduration=4.028261202 podStartE2EDuration="21.101338788s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.87553649 +0000 UTC m=+3.464734753" lastFinishedPulling="2026-04-16 18:10:19.948614074 +0000 UTC m=+20.537812339" observedRunningTime="2026-04-16 18:10:21.100837157 +0000 UTC m=+21.690035462" watchObservedRunningTime="2026-04-16 18:10:21.101338788 +0000 UTC m=+21.690537074" Apr 16 18:10:21.114871 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.114832 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hhl4h" podStartSLOduration=4.037414245 podStartE2EDuration="21.114818995s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.871245313 +0000 UTC m=+3.460443583" lastFinishedPulling="2026-04-16 18:10:19.948650064 +0000 UTC m=+20.537848333" observedRunningTime="2026-04-16 18:10:21.114253971 +0000 UTC m=+21.703452276" watchObservedRunningTime="2026-04-16 18:10:21.114818995 +0000 UTC m=+21.704017280" Apr 16 18:10:21.129088 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.129048 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5tctm" podStartSLOduration=4.046235641 podStartE2EDuration="21.129036084s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.865752762 +0000 UTC m=+3.454951040" lastFinishedPulling="2026-04-16 18:10:19.948553218 +0000 UTC m=+20.537751483" observedRunningTime="2026-04-16 18:10:21.128570198 +0000 UTC m=+21.717768503" watchObservedRunningTime="2026-04-16 18:10:21.129036084 +0000 UTC m=+21.718234370" Apr 16 18:10:21.164714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.164664 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qtmlc" 
podStartSLOduration=4.047368802 podStartE2EDuration="21.164648386s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.870157304 +0000 UTC m=+3.459355571" lastFinishedPulling="2026-04-16 18:10:19.987436884 +0000 UTC m=+20.576635155" observedRunningTime="2026-04-16 18:10:21.16450577 +0000 UTC m=+21.753704067" watchObservedRunningTime="2026-04-16 18:10:21.164648386 +0000 UTC m=+21.753846671" Apr 16 18:10:21.614255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.614217 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:10:21.940491 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.940360 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:10:21.614237965Z","UUID":"3d0fc7ee-9493-4ade-a6c7-81b953761e5c","Handler":null,"Name":"","Endpoint":""} Apr 16 18:10:21.943771 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.943745 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:10:21.943902 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.943779 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:10:21.987485 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.987454 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:21.987688 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:21.987461 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:21.987688 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:21.987607 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:21.987688 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:21.987671 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:22.086568 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.086518 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4wkgk" event={"ID":"fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916","Type":"ContainerStarted","Data":"6dbfb501da9fe0404bd714981163e399035461ee64ea186385952907f9c43cef"} Apr 16 18:10:22.091080 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.091051 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:10:22.091510 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.091481 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"81052a372db167b0fd7e6237ca533b81f3c9e0c23a13b937a7bda7f86210ff9c"} Apr 16 
18:10:22.091616 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.091524 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"32b8a9d8da8b82de1648210eefe7c7b2e38459faffb39955ba509e73ce54011b"} Apr 16 18:10:22.093365 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.093336 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" event={"ID":"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2","Type":"ContainerStarted","Data":"825186a14890688ab23ed560c894577ca4b5f24975299c9e8ee50e514c3e5bdd"} Apr 16 18:10:22.105135 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:22.105084 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4wkgk" podStartSLOduration=5.002732877 podStartE2EDuration="22.105049449s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.866749306 +0000 UTC m=+3.455947571" lastFinishedPulling="2026-04-16 18:10:19.969065872 +0000 UTC m=+20.558264143" observedRunningTime="2026-04-16 18:10:22.104458639 +0000 UTC m=+22.693656926" watchObservedRunningTime="2026-04-16 18:10:22.105049449 +0000 UTC m=+22.694247734" Apr 16 18:10:23.097907 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:23.097808 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" event={"ID":"a4c77a14-b26d-4fd9-91aa-dbc7c0b543f2","Type":"ContainerStarted","Data":"f09c09ba9da20903fbe2d8e6feb74875d24a55539eeec9907f7280ceddb604ec"} Apr 16 18:10:23.115334 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:23.115285 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hz66h" podStartSLOduration=3.302405023 podStartE2EDuration="23.115267317s" 
podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.875743617 +0000 UTC m=+3.464941884" lastFinishedPulling="2026-04-16 18:10:22.688605914 +0000 UTC m=+23.277804178" observedRunningTime="2026-04-16 18:10:23.114896663 +0000 UTC m=+23.704094951" watchObservedRunningTime="2026-04-16 18:10:23.115267317 +0000 UTC m=+23.704465646" Apr 16 18:10:23.987290 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:23.987258 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:23.987474 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:23.987315 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:23.987474 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:23.987404 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:23.987610 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:23.987535 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:24.103311 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:24.103285 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:10:24.103779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:24.103713 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"c3df0bfe4c4c795dda2966beab2b19accf9ae093456d21a17fc8b399986d7ff9"} Apr 16 18:10:24.610295 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:24.610260 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:24.611014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:24.610993 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:25.987637 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:25.987448 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:25.988304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:25.987449 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:25.988304 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:25.987722 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:25.988304 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:25.987804 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:26.112047 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.112014 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="504184dddebd7a5eae872cc1765c27a9d003be3aead696bddd48d2f5801f0d24" exitCode=0 Apr 16 18:10:26.112190 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.112097 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"504184dddebd7a5eae872cc1765c27a9d003be3aead696bddd48d2f5801f0d24"} Apr 16 18:10:26.115208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.115187 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:10:26.115499 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.115476 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"4a442a4978b80e6753511d0213ece8fb23cdd2e23bbcf3b52a7214ee89cf776b"} Apr 16 18:10:26.115836 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.115787 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:26.115836 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.115827 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:26.115965 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.115953 2583 scope.go:117] "RemoveContainer" containerID="1df041808b76c5648fa93dab2851607623491dfadcf79480a16898a3033ad516" Apr 16 18:10:26.132236 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.132217 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:26.279966 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.279883 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:26.280112 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.279998 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 18:10:26.280411 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:26.280393 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vhpgc" Apr 16 18:10:27.121008 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.120950 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:10:27.121424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.121305 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" event={"ID":"e9da1b91-a9ae-4adf-ac9f-881e7217faad","Type":"ContainerStarted","Data":"c234d15bb6d177347acf8763c3ddff2dd6892dd0620a8c24f57250f17f257de1"} Apr 16 18:10:27.121506 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.121487 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:27.123428 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.123406 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="dd4c95f22d9ff9a299a6a821fc8e4f16f9d32ecad0876a254e2177a1217bb625" exitCode=0 Apr 16 18:10:27.123548 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.123472 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"dd4c95f22d9ff9a299a6a821fc8e4f16f9d32ecad0876a254e2177a1217bb625"} Apr 16 18:10:27.136937 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.136916 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:27.150503 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.150463 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" podStartSLOduration=9.683287847 podStartE2EDuration="27.150450346s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.861136818 +0000 UTC m=+3.450335082" lastFinishedPulling="2026-04-16 18:10:20.328299317 +0000 UTC m=+20.917497581" observedRunningTime="2026-04-16 18:10:27.149692357 +0000 UTC m=+27.738890644" watchObservedRunningTime="2026-04-16 18:10:27.150450346 +0000 UTC m=+27.739648697" Apr 16 18:10:27.540554 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.540128 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hhl4h_c28c1588-a1f5-4491-bbfc-135c1d264663/dns-node-resolver/0.log" Apr 16 18:10:27.546663 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.546637 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-chzqx"] Apr 16 18:10:27.547263 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:10:27.547229 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:27.547372 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.547322 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9rmkx"] Apr 16 18:10:27.547372 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:27.547358 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:27.547468 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:27.547436 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:27.547563 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:27.547522 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:28.128314 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:28.128143 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="18d22305f09119ca84f74c12a12afa53e9242d34f8ffb92268767d23b94e7db9" exitCode=0 Apr 16 18:10:28.128702 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:28.128212 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"18d22305f09119ca84f74c12a12afa53e9242d34f8ffb92268767d23b94e7db9"} Apr 16 18:10:28.526482 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:28.526446 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5tctm_f7de0432-d90d-4397-aa83-1a431f35bfa6/node-ca/0.log" Apr 16 18:10:28.987335 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:28.987303 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:28.987335 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:28.987318 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:28.987546 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:28.987418 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:28.987632 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:28.987605 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:30.987921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:30.987887 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:30.988568 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:30.988008 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:30.988568 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:30.988062 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:30.988568 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:30.988151 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:32.987114 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:32.987075 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:32.987765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:32.987073 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:32.987765 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:32.987214 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:32.987765 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:32.987287 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8" Apr 16 18:10:33.720813 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:33.720779 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:33.720977 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.720952 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:33.721055 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.721043 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs podName:4eacb341-6891-41dc-a3c0-09b5697178ee nodeName:}" failed. No retries permitted until 2026-04-16 18:11:05.721023049 +0000 UTC m=+66.310221336 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs") pod "network-metrics-daemon-chzqx" (UID: "4eacb341-6891-41dc-a3c0-09b5697178ee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:10:33.821141 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:33.821106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:33.821320 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.821277 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:10:33.821320 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.821299 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:10:33.821320 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.821311 2583 projected.go:194] Error preparing data for projected volume kube-api-access-qpkv7 for pod openshift-network-diagnostics/network-check-target-9rmkx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:33.821440 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:33.821369 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7 podName:d252d242-5753-478c-9b07-d4b27eb2d3e8 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:11:05.821354592 +0000 UTC m=+66.410552856 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qpkv7" (UniqueName: "kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7") pod "network-check-target-9rmkx" (UID: "d252d242-5753-478c-9b07-d4b27eb2d3e8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:10:34.987899 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:34.987860 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx" Apr 16 18:10:34.988384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:34.987858 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:10:34.988384 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:34.987982 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee" Apr 16 18:10:34.988384 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:34.988039 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:35.144074 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:35.144040 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="abb75d07e107b720c0c6fd3047c0b6fb515286026390a4964a4886506be27cdc" exitCode=0
Apr 16 18:10:35.144229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:35.144101 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"abb75d07e107b720c0c6fd3047c0b6fb515286026390a4964a4886506be27cdc"}
Apr 16 18:10:36.149156 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:36.149125 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3c85f82-6657-4a18-9364-c3ce61213e8a" containerID="919e4740f6841b3247473a05ac11037a2728d71fdbdf2927b34e6eee3a90d098" exitCode=0
Apr 16 18:10:36.149533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:36.149191 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerDied","Data":"919e4740f6841b3247473a05ac11037a2728d71fdbdf2927b34e6eee3a90d098"}
Apr 16 18:10:36.987644 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:36.987449 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:36.987810 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:36.987500 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:36.987810 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:36.987725 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:36.987810 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:36.987791 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:37.153350 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:37.153320 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cscsn" event={"ID":"c3c85f82-6657-4a18-9364-c3ce61213e8a","Type":"ContainerStarted","Data":"831a4867b24df4bee3987112c5118893508727c019b5c790456cfc954403b546"}
Apr 16 18:10:37.177467 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:37.177416 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cscsn" podStartSLOduration=5.991517644 podStartE2EDuration="37.17739763s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:10:02.86442501 +0000 UTC m=+3.453623274" lastFinishedPulling="2026-04-16 18:10:34.050304995 +0000 UTC m=+34.639503260" observedRunningTime="2026-04-16 18:10:37.177094137 +0000 UTC m=+37.766292422" watchObservedRunningTime="2026-04-16 18:10:37.17739763 +0000 UTC m=+37.766596117"
Apr 16 18:10:38.987147 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:38.987107 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:38.987644 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:38.987107 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:38.987644 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:38.987217 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:38.987644 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:38.987340 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:40.987021 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:40.986984 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:40.987483 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:40.986996 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:40.987483 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:40.987107 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:40.987483 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:40.987168 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:42.987797 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:42.987762 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:42.988213 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:42.987765 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:42.988213 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:42.987949 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:42.988213 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:42.987859 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:44.987134 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:44.987098 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:44.987614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:44.987098 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:44.987614 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:44.987216 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:44.987614 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:44.987296 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:46.987489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:46.987450 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:46.987972 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:46.987450 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:46.987972 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:46.987553 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:46.987972 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:46.987697 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:48.987431 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:48.987393 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:48.987905 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:48.987402 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:48.987905 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:48.987490 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:48.987905 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:48.987609 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:50.987255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:50.987218 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:50.987664 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:50.987223 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:50.987664 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:50.987333 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:50.987664 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:50.987412 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:52.987813 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:52.987782 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:52.988253 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:52.987793 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:52.988253 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:52.987904 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9rmkx" podUID="d252d242-5753-478c-9b07-d4b27eb2d3e8"
Apr 16 18:10:52.988253 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:10:52.987957 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-chzqx" podUID="4eacb341-6891-41dc-a3c0-09b5697178ee"
Apr 16 18:10:53.224749 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.224714 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-95.ec2.internal" event="NodeReady"
Apr 16 18:10:53.224915 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.224840 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:10:53.269071 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.269030 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zkgtz"]
Apr 16 18:10:53.314931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.314901 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d9r4d"]
Apr 16 18:10:53.315089 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.315059 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.317918 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.317897 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:10:53.318263 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.318248 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkg79\""
Apr 16 18:10:53.318321 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.318263 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:10:53.326180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.326158 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zkgtz"]
Apr 16 18:10:53.326180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.326182 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d9r4d"]
Apr 16 18:10:53.326304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.326193 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r62m9"]
Apr 16 18:10:53.326304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.326297 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.328692 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.328667 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:10:53.328692 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.328679 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:10:53.328845 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.328717 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:10:53.328845 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.328753 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bxtnw\""
Apr 16 18:10:53.328845 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.328715 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:10:53.350714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.350688 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r62m9"]
Apr 16 18:10:53.350865 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.350818 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.353096 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.353077 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:10:53.353320 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.353301 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7tj5h\""
Apr 16 18:10:53.353320 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.353317 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:10:53.353441 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.353306 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:10:53.455402 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455366 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgknv\" (UniqueName: \"kubernetes.io/projected/f60a69fc-3609-441f-9761-47098d24b5d0-kube-api-access-jgknv\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.455402 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455407 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22d015bf-6f4c-43a3-9b78-b7db19a716eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455426 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvsn\" (UniqueName: \"kubernetes.io/projected/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-api-access-6dvsn\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455529 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f60a69fc-3609-441f-9761-47098d24b5d0-tmp-dir\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455561 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f60a69fc-3609-441f-9761-47098d24b5d0-config-volume\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455612 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dda202b0-970a-4797-9f1d-010604ebe152-cert\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455635 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22d015bf-6f4c-43a3-9b78-b7db19a716eb-data-volume\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455654 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/22d015bf-6f4c-43a3-9b78-b7db19a716eb-crio-socket\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.455683 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455680 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60a69fc-3609-441f-9761-47098d24b5d0-metrics-tls\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.455903 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2f9v\" (UniqueName: \"kubernetes.io/projected/dda202b0-970a-4797-9f1d-010604ebe152-kube-api-access-r2f9v\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.455903 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.455736 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556332 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22d015bf-6f4c-43a3-9b78-b7db19a716eb-data-volume\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556332 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556281 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/22d015bf-6f4c-43a3-9b78-b7db19a716eb-crio-socket\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556332 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556324 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60a69fc-3609-441f-9761-47098d24b5d0-metrics-tls\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556349 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2f9v\" (UniqueName: \"kubernetes.io/projected/dda202b0-970a-4797-9f1d-010604ebe152-kube-api-access-r2f9v\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556373 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556397 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/22d015bf-6f4c-43a3-9b78-b7db19a716eb-crio-socket\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556409 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgknv\" (UniqueName: \"kubernetes.io/projected/f60a69fc-3609-441f-9761-47098d24b5d0-kube-api-access-jgknv\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556435 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22d015bf-6f4c-43a3-9b78-b7db19a716eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556470 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvsn\" (UniqueName: \"kubernetes.io/projected/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-api-access-6dvsn\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556516 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f60a69fc-3609-441f-9761-47098d24b5d0-tmp-dir\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.556620 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556570 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f60a69fc-3609-441f-9761-47098d24b5d0-config-volume\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.556985 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556650 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dda202b0-970a-4797-9f1d-010604ebe152-cert\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.556985 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.556694 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22d015bf-6f4c-43a3-9b78-b7db19a716eb-data-volume\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.557098 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.557079 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.560221 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.560191 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22d015bf-6f4c-43a3-9b78-b7db19a716eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.560414 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.560394 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dda202b0-970a-4797-9f1d-010604ebe152-cert\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.564660 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.564627 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvsn\" (UniqueName: \"kubernetes.io/projected/22d015bf-6f4c-43a3-9b78-b7db19a716eb-kube-api-access-6dvsn\") pod \"insights-runtime-extractor-d9r4d\" (UID: \"22d015bf-6f4c-43a3-9b78-b7db19a716eb\") " pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.567160 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.567127 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f60a69fc-3609-441f-9761-47098d24b5d0-tmp-dir\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.567329 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.567311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f60a69fc-3609-441f-9761-47098d24b5d0-metrics-tls\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.567463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.567446 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgknv\" (UniqueName: \"kubernetes.io/projected/f60a69fc-3609-441f-9761-47098d24b5d0-kube-api-access-jgknv\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.577622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.577571 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f60a69fc-3609-441f-9761-47098d24b5d0-config-volume\") pod \"dns-default-zkgtz\" (UID: \"f60a69fc-3609-441f-9761-47098d24b5d0\") " pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.577701 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.577669 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2f9v\" (UniqueName: \"kubernetes.io/projected/dda202b0-970a-4797-9f1d-010604ebe152-kube-api-access-r2f9v\") pod \"ingress-canary-r62m9\" (UID: \"dda202b0-970a-4797-9f1d-010604ebe152\") " pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.624244 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.624217 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zkgtz"
Apr 16 18:10:53.634984 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.634961 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d9r4d"
Apr 16 18:10:53.672023 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.671994 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r62m9"
Apr 16 18:10:53.814014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.813923 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zkgtz"]
Apr 16 18:10:53.818168 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:53.818121 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60a69fc_3609_441f_9761_47098d24b5d0.slice/crio-5e34e25d4561c3981442ee5db3e6cc828e8cbfe9d3a8218afac1e37fcf27e2da WatchSource:0}: Error finding container 5e34e25d4561c3981442ee5db3e6cc828e8cbfe9d3a8218afac1e37fcf27e2da: Status 404 returned error can't find the container with id 5e34e25d4561c3981442ee5db3e6cc828e8cbfe9d3a8218afac1e37fcf27e2da
Apr 16 18:10:53.818542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.818519 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d9r4d"]
Apr 16 18:10:53.822471 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:53.822448 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22d015bf_6f4c_43a3_9b78_b7db19a716eb.slice/crio-9ed358bacc62fee9a2a2586383d7c03f73ae9cafd24769d366cfd82431e2f148 WatchSource:0}: Error finding container 9ed358bacc62fee9a2a2586383d7c03f73ae9cafd24769d366cfd82431e2f148: Status 404 returned error can't find the container with id 9ed358bacc62fee9a2a2586383d7c03f73ae9cafd24769d366cfd82431e2f148
Apr 16 18:10:53.842364 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:53.842338 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r62m9"]
Apr 16 18:10:53.845400 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:53.845375 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddda202b0_970a_4797_9f1d_010604ebe152.slice/crio-516462a2c2df2a07fdeebd5df3bfc8e877f30d5269481ffcccda04aad4beb63d WatchSource:0}: Error finding container 516462a2c2df2a07fdeebd5df3bfc8e877f30d5269481ffcccda04aad4beb63d: Status 404 returned error can't find the container with id 516462a2c2df2a07fdeebd5df3bfc8e877f30d5269481ffcccda04aad4beb63d
Apr 16 18:10:54.184240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.184197 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkgtz" event={"ID":"f60a69fc-3609-441f-9761-47098d24b5d0","Type":"ContainerStarted","Data":"5e34e25d4561c3981442ee5db3e6cc828e8cbfe9d3a8218afac1e37fcf27e2da"}
Apr 16 18:10:54.185141 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.185116 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r62m9" event={"ID":"dda202b0-970a-4797-9f1d-010604ebe152","Type":"ContainerStarted","Data":"516462a2c2df2a07fdeebd5df3bfc8e877f30d5269481ffcccda04aad4beb63d"}
Apr 16 18:10:54.186274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.186248 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d9r4d" event={"ID":"22d015bf-6f4c-43a3-9b78-b7db19a716eb","Type":"ContainerStarted","Data":"0d873ec0f89931f58ab10e5fb0ecade3278bda56abe9fa765737689de9f1c0ae"}
Apr 16 18:10:54.186274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.186274 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d9r4d" event={"ID":"22d015bf-6f4c-43a3-9b78-b7db19a716eb","Type":"ContainerStarted","Data":"9ed358bacc62fee9a2a2586383d7c03f73ae9cafd24769d366cfd82431e2f148"}
Apr 16 18:10:54.987173 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.987138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:10:54.987333 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.987138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:10:54.989550 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.989529 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:10:54.989847 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.989831 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:10:54.990493 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.990478 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:10:54.990595 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.990506 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\""
Apr 16 18:10:54.990664 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:54.990511 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w4nfv\""
Apr 16 18:10:56.747790 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.747759 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-qjv4v"]
Apr 16 18:10:56.774460 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.774419 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-qjv4v"]
Apr 16 18:10:56.774635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.774606 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v"
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777424 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777438 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777445 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-tbfts\""
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777462 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777439 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:10:56.777566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.777489 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:10:56.882691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.882655 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/add4f806-9183-43ee-8bbc-9162c6bd6dc1-metrics-client-ca\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v"
Apr 16 18:10:56.882871 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.882738 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.882871 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.882783 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.882950 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.882875 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwqq\" (UniqueName: \"kubernetes.io/projected/add4f806-9183-43ee-8bbc-9162c6bd6dc1-kube-api-access-5zwqq\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.984136 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.984036 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.984136 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.984091 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.984334 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.984137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwqq\" (UniqueName: \"kubernetes.io/projected/add4f806-9183-43ee-8bbc-9162c6bd6dc1-kube-api-access-5zwqq\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.984742 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.984323 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/add4f806-9183-43ee-8bbc-9162c6bd6dc1-metrics-client-ca\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.988519 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.985472 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/add4f806-9183-43ee-8bbc-9162c6bd6dc1-metrics-client-ca\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.988519 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.987527 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.988519 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.987551 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/add4f806-9183-43ee-8bbc-9162c6bd6dc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:56.993527 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:56.993479 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwqq\" (UniqueName: \"kubernetes.io/projected/add4f806-9183-43ee-8bbc-9162c6bd6dc1-kube-api-access-5zwqq\") pod \"prometheus-operator-78f957474d-qjv4v\" (UID: \"add4f806-9183-43ee-8bbc-9162c6bd6dc1\") " pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:57.083932 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.083894 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" Apr 16 18:10:57.196214 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.196174 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r62m9" event={"ID":"dda202b0-970a-4797-9f1d-010604ebe152","Type":"ContainerStarted","Data":"20e6be9aa69b02d1e070b0e4cc576da351a7afc54e8fa6e0dafffae39af1796c"} Apr 16 18:10:57.198335 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.198296 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d9r4d" event={"ID":"22d015bf-6f4c-43a3-9b78-b7db19a716eb","Type":"ContainerStarted","Data":"3454f0cdb40d78e1b7dee6a57c1c69dfd6408dfc7d8c1f406714c9b08cc9edd1"} Apr 16 18:10:57.201178 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.201152 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkgtz" event={"ID":"f60a69fc-3609-441f-9761-47098d24b5d0","Type":"ContainerStarted","Data":"3a91b15ac489515967a739a8bb736f310d43238cfbb68579703d424ce67f0bc4"} Apr 16 18:10:57.201282 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.201187 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkgtz" event={"ID":"f60a69fc-3609-441f-9761-47098d24b5d0","Type":"ContainerStarted","Data":"b6f1e3e9df3f6bbbf7ce6df6adf9c7d8ffb4e2574f3c31eae8462540951deb05"} Apr 16 18:10:57.201631 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.201616 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zkgtz" Apr 16 18:10:57.213828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.213780 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r62m9" podStartSLOduration=1.604917948 podStartE2EDuration="4.213762699s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 
18:10:53.847190925 +0000 UTC m=+54.436389190" lastFinishedPulling="2026-04-16 18:10:56.456035662 +0000 UTC m=+57.045233941" observedRunningTime="2026-04-16 18:10:57.212621398 +0000 UTC m=+57.801819683" watchObservedRunningTime="2026-04-16 18:10:57.213762699 +0000 UTC m=+57.802960990" Apr 16 18:10:57.221747 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.221702 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-qjv4v"] Apr 16 18:10:57.225242 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:57.225217 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd4f806_9183_43ee_8bbc_9162c6bd6dc1.slice/crio-4a435f03024975eafbd77af599bb3835900b31472dc1cde45deffc377b9da3a7 WatchSource:0}: Error finding container 4a435f03024975eafbd77af599bb3835900b31472dc1cde45deffc377b9da3a7: Status 404 returned error can't find the container with id 4a435f03024975eafbd77af599bb3835900b31472dc1cde45deffc377b9da3a7 Apr 16 18:10:57.235646 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.234132 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zkgtz" podStartSLOduration=1.603184759 podStartE2EDuration="4.234116839s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.820129314 +0000 UTC m=+54.409327592" lastFinishedPulling="2026-04-16 18:10:56.451061409 +0000 UTC m=+57.040259672" observedRunningTime="2026-04-16 18:10:57.233950069 +0000 UTC m=+57.823148356" watchObservedRunningTime="2026-04-16 18:10:57.234116839 +0000 UTC m=+57.823315126" Apr 16 18:10:57.259177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.259112 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:10:57.264237 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.264214 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.266756 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.266500 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:10:57.266756 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.266547 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:10:57.266756 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.266610 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:10:57.266756 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.266671 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:10:57.267018 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.266799 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:10:57.267090 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.267059 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:10:57.267256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.267238 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:10:57.267367 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.267246 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-szppp\"" Apr 16 18:10:57.269938 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.269892 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:10:57.388500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388467 
2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.388500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388502 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.388805 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388532 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rxp\" (UniqueName: \"kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.388887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388818 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.388887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388879 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: 
\"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.389024 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.388916 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489698 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489617 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489698 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489662 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489909 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489702 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489909 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489720 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489909 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489744 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94rxp\" (UniqueName: \"kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.489909 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.489821 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.490594 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.490460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.490842 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.490820 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.491043 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.491020 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.492671 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.492652 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.492750 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.492704 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.497902 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.497878 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rxp\" (UniqueName: \"kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp\") pod \"console-6f576946cd-gdtrr\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.577060 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.577007 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:10:57.972604 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:57.972560 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:10:57.976390 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:10:57.976354 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d3db23b_0ff6_4a77_b594_d78e3a392d67.slice/crio-a19b4b495b84b437226fb58e9e9535d9a50f02644bf3b0896099fcec08db2649 WatchSource:0}: Error finding container a19b4b495b84b437226fb58e9e9535d9a50f02644bf3b0896099fcec08db2649: Status 404 returned error can't find the container with id a19b4b495b84b437226fb58e9e9535d9a50f02644bf3b0896099fcec08db2649 Apr 16 18:10:58.204881 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:58.204838 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f576946cd-gdtrr" event={"ID":"9d3db23b-0ff6-4a77-b594-d78e3a392d67","Type":"ContainerStarted","Data":"a19b4b495b84b437226fb58e9e9535d9a50f02644bf3b0896099fcec08db2649"} Apr 16 18:10:58.205976 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:58.205946 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" event={"ID":"add4f806-9183-43ee-8bbc-9162c6bd6dc1","Type":"ContainerStarted","Data":"4a435f03024975eafbd77af599bb3835900b31472dc1cde45deffc377b9da3a7"} Apr 16 18:10:58.207959 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:58.207934 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d9r4d" event={"ID":"22d015bf-6f4c-43a3-9b78-b7db19a716eb","Type":"ContainerStarted","Data":"388c264bedbd16a460df11ea47e722571936a2821a187251b4d4af21dc0c154c"} Apr 16 18:10:58.225060 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:58.225002 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-insights/insights-runtime-extractor-d9r4d" podStartSLOduration=1.310218697 podStartE2EDuration="5.224984204s" podCreationTimestamp="2026-04-16 18:10:53 +0000 UTC" firstStartedPulling="2026-04-16 18:10:53.991783853 +0000 UTC m=+54.580982118" lastFinishedPulling="2026-04-16 18:10:57.906549358 +0000 UTC m=+58.495747625" observedRunningTime="2026-04-16 18:10:58.224623852 +0000 UTC m=+58.813822139" watchObservedRunningTime="2026-04-16 18:10:58.224984204 +0000 UTC m=+58.814182483" Apr 16 18:10:59.141466 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:59.141439 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7ztx" Apr 16 18:10:59.213241 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:59.213186 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" event={"ID":"add4f806-9183-43ee-8bbc-9162c6bd6dc1","Type":"ContainerStarted","Data":"0a52d165477326322485e44a4c20ddd51b70ab9a2e0044ab0671b7b4d9f8ba08"} Apr 16 18:10:59.213241 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:59.213240 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" event={"ID":"add4f806-9183-43ee-8bbc-9162c6bd6dc1","Type":"ContainerStarted","Data":"777b9ed74816b575e8fdec315802ce380d5776569b095d11d181edd75f8107b1"} Apr 16 18:10:59.231255 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:10:59.231198 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-qjv4v" podStartSLOduration=1.761836879 podStartE2EDuration="3.231179971s" podCreationTimestamp="2026-04-16 18:10:56 +0000 UTC" firstStartedPulling="2026-04-16 18:10:57.227935127 +0000 UTC m=+57.817133390" lastFinishedPulling="2026-04-16 18:10:58.697278201 +0000 UTC m=+59.286476482" observedRunningTime="2026-04-16 18:10:59.230279076 +0000 UTC m=+59.819477363" 
watchObservedRunningTime="2026-04-16 18:10:59.231179971 +0000 UTC m=+59.820378259" Apr 16 18:11:01.118401 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.118358 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc"] Apr 16 18:11:01.126924 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.126897 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j95hp"] Apr 16 18:11:01.127108 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.127088 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.129263 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.129241 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:11:01.129393 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.129270 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:11:01.129543 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.129527 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-f7fjn\"" Apr 16 18:11:01.130855 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.130835 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.134808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.134183 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:11:01.134808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.134436 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rnk47\"" Apr 16 18:11:01.134808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.134643 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:11:01.134808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.134709 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:11:01.135459 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.135440 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc"] Apr 16 18:11:01.219190 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219158 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-textfile\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j95hp\" (UID: 
\"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219273 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219296 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219316 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-metrics-client-ca\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219333 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 
18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219351 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-root\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219389 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219379 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219464 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-wtmp\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219497 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-sys\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219529 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls\") pod 
\"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219611 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szk6p\" (UniqueName: \"kubernetes.io/projected/935b67f9-01a4-4c36-99a5-76ff15afe07f-kube-api-access-szk6p\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.219741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.219643 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb79\" (UniqueName: \"kubernetes.io/projected/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-kube-api-access-swb79\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.320547 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320509 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.320547 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320547 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-root\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320568 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320613 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-wtmp\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320635 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-sys\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320648 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-root\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320721 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-szk6p\" (UniqueName: \"kubernetes.io/projected/935b67f9-01a4-4c36-99a5-76ff15afe07f-kube-api-access-szk6p\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:01.320731 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:11:01.320778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swb79\" (UniqueName: \"kubernetes.io/projected/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-kube-api-access-swb79\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320789 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-wtmp\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320845 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/935b67f9-01a4-4c36-99a5-76ff15afe07f-sys\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:01.320798 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls podName:935b67f9-01a4-4c36-99a5-76ff15afe07f 
nodeName:}" failed. No retries permitted until 2026-04-16 18:11:01.820777493 +0000 UTC m=+62.409975777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls") pod "node-exporter-j95hp" (UID: "935b67f9-01a4-4c36-99a5-76ff15afe07f") : secret "node-exporter-tls" not found Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320880 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-textfile\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320936 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.320995 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321024 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.321170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321053 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-metrics-client-ca\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321239 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.321566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321259 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.321566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321408 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-textfile\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 
18:11:01.321732 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.321634 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/935b67f9-01a4-4c36-99a5-76ff15afe07f-metrics-client-ca\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.323832 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.323807 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.323984 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.323813 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.323984 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.323844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.329935 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.329908 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szk6p\" (UniqueName: 
\"kubernetes.io/projected/935b67f9-01a4-4c36-99a5-76ff15afe07f-kube-api-access-szk6p\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.330038 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.329906 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swb79\" (UniqueName: \"kubernetes.io/projected/bf5cf0e6-a55b-42f8-b609-f62502a6a48b-kube-api-access-swb79\") pod \"openshift-state-metrics-5669946b84-9wqsc\" (UID: \"bf5cf0e6-a55b-42f8-b609-f62502a6a48b\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.441235 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.441197 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" Apr 16 18:11:01.568354 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.568320 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc"] Apr 16 18:11:01.571600 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:01.571549 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5cf0e6_a55b_42f8_b609_f62502a6a48b.slice/crio-00ad0c78c39ba5f707b8da580f8edad0dcf92560dd803b2dc8fe37b42db3a221 WatchSource:0}: Error finding container 00ad0c78c39ba5f707b8da580f8edad0dcf92560dd803b2dc8fe37b42db3a221: Status 404 returned error can't find the container with id 00ad0c78c39ba5f707b8da580f8edad0dcf92560dd803b2dc8fe37b42db3a221 Apr 16 18:11:01.825707 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.825657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls\") pod \"node-exporter-j95hp\" (UID: 
\"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:01.828102 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:01.828080 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/935b67f9-01a4-4c36-99a5-76ff15afe07f-node-exporter-tls\") pod \"node-exporter-j95hp\" (UID: \"935b67f9-01a4-4c36-99a5-76ff15afe07f\") " pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:02.047262 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.047229 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j95hp" Apr 16 18:11:02.055396 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:02.055361 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935b67f9_01a4_4c36_99a5_76ff15afe07f.slice/crio-52b84a9de23525210672cfc78d67b53cf4c3c85f5d2964eb80a04d35ca84aa9d WatchSource:0}: Error finding container 52b84a9de23525210672cfc78d67b53cf4c3c85f5d2964eb80a04d35ca84aa9d: Status 404 returned error can't find the container with id 52b84a9de23525210672cfc78d67b53cf4c3c85f5d2964eb80a04d35ca84aa9d Apr 16 18:11:02.198037 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.195800 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:11:02.205314 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.205282 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.207667 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.207640 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:11:02.207869 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.207666 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:11:02.207869 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.207676 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:11:02.207869 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.207677 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:11:02.208093 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.207916 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:11:02.208190 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.208108 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:11:02.208190 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.208147 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hnrnw\"" Apr 16 18:11:02.208190 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.208167 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:11:02.208404 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.208388 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:11:02.208476 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.208392 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:11:02.216877 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.216852 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:11:02.223623 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.223592 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j95hp" event={"ID":"935b67f9-01a4-4c36-99a5-76ff15afe07f","Type":"ContainerStarted","Data":"52b84a9de23525210672cfc78d67b53cf4c3c85f5d2964eb80a04d35ca84aa9d"} Apr 16 18:11:02.226275 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.226246 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" event={"ID":"bf5cf0e6-a55b-42f8-b609-f62502a6a48b","Type":"ContainerStarted","Data":"966c476feca12d2ee75a6aa678fbc08b45c0cda5e42f91a3004ebc554d5df555"} Apr 16 18:11:02.226384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.226278 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" event={"ID":"bf5cf0e6-a55b-42f8-b609-f62502a6a48b","Type":"ContainerStarted","Data":"73f5fbed238aacc5ef1adc1c6ed464b920528b351d4c79a9dcff4851bda28398"} Apr 16 18:11:02.226384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.226289 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" event={"ID":"bf5cf0e6-a55b-42f8-b609-f62502a6a48b","Type":"ContainerStarted","Data":"00ad0c78c39ba5f707b8da580f8edad0dcf92560dd803b2dc8fe37b42db3a221"} Apr 16 18:11:02.227696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.227674 2583 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f576946cd-gdtrr" event={"ID":"9d3db23b-0ff6-4a77-b594-d78e3a392d67","Type":"ContainerStarted","Data":"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8"} Apr 16 18:11:02.229084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229058 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229228 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229228 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229136 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229228 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229166 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229226 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms4z\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229262 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229300 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229329 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229361 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229388 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229441 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.229479 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.229474 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.246648 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.246571 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f576946cd-gdtrr" podStartSLOduration=1.878427518 podStartE2EDuration="5.24655505s" podCreationTimestamp="2026-04-16 18:10:57 +0000 UTC" firstStartedPulling="2026-04-16 18:10:57.978428041 +0000 UTC m=+58.567626304" lastFinishedPulling="2026-04-16 18:11:01.34655557 +0000 UTC m=+61.935753836" observedRunningTime="2026-04-16 18:11:02.245170711 +0000 UTC m=+62.834369019" watchObservedRunningTime="2026-04-16 18:11:02.24655505 +0000 UTC m=+62.835753336" Apr 16 18:11:02.329945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.329865 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tms4z\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330104 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.329983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330104 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330023 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330104 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330049 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330104 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330089 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330142 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:11:02.330330 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:02.330179 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330197 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330214 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330234 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.330269 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.330330 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:02.330327 2583 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 18:11:02.330864 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:02.330386 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls podName:7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:02.830368626 +0000 UTC m=+63.419566893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70") : secret "alertmanager-main-tls" not found
Apr 16 18:11:02.330864 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:02.330827 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle podName:7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70 nodeName:}" failed. No retries permitted until 2026-04-16 18:11:02.83081027 +0000 UTC m=+63.420008549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70") : configmap references non-existent config key: ca-bundle.crt
Apr 16 18:11:02.331235 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.331209 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.331478 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.331456 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.334618 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.334556 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.335211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.335165 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.335312 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.335179 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.335312 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.335235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.335765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.335740 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.336547 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.336521 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.337482 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.337450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.337834 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.337637 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.338685 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.338664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms4z\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.834928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.834892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.835100 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.834950 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.835702 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.835680 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:02.837398 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:02.837380 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:03.120949 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.120867 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:11:03.523939 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.523903 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:11:03.527901 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:03.527867 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b14e9cb_a8aa_41ea_8a56_efa56a9eaa70.slice/crio-e52660f49111f26c4fdd93ff38ea20485bbde8cd921e9fde29ab98122128f470 WatchSource:0}: Error finding container e52660f49111f26c4fdd93ff38ea20485bbde8cd921e9fde29ab98122128f470: Status 404 returned error can't find the container with id e52660f49111f26c4fdd93ff38ea20485bbde8cd921e9fde29ab98122128f470
Apr 16 18:11:03.853765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.853730 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58bfbc7597-swr47"]
Apr 16 18:11:03.870931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.870905 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bfbc7597-swr47"]
Apr 16 18:11:03.871035 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.871029 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.883134 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.883107 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:11:03.946765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946711 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.946765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946763 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.946968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946783 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.946968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946853 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.946968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946872 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.946968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkxt8\" (UniqueName: \"kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:03.947090 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:03.946981 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.047872 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.047833 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048037 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.047912 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048037 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048017 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkxt8\" (UniqueName: \"kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048155 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048049 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048155 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048077 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048155 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048155 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048140 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.048866 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.048803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.049067 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.049036 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.049180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.049162 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.050924 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.050901 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.051304 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.051279 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.051717 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.051694 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.058656 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.058633 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkxt8\" (UniqueName: \"kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8\") pod \"console-58bfbc7597-swr47\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.182500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.182461 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bfbc7597-swr47"
Apr 16 18:11:04.234278 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.234238 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"e52660f49111f26c4fdd93ff38ea20485bbde8cd921e9fde29ab98122128f470"}
Apr 16 18:11:04.238944 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.238896 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" event={"ID":"bf5cf0e6-a55b-42f8-b609-f62502a6a48b","Type":"ContainerStarted","Data":"f5c13b1271313205f63230bf53c10f497c6b12908a356364771fb57b82fe2eff"}
Apr 16 18:11:04.241568 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.241538 2583 generic.go:358] "Generic (PLEG): container finished" podID="935b67f9-01a4-4c36-99a5-76ff15afe07f" containerID="bcf9c950f527baed09bf213d9ffc96cd1e5f855902665da71af0adc2c88044ed" exitCode=0
Apr 16 18:11:04.241720 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.241616 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j95hp" event={"ID":"935b67f9-01a4-4c36-99a5-76ff15afe07f","Type":"ContainerDied","Data":"bcf9c950f527baed09bf213d9ffc96cd1e5f855902665da71af0adc2c88044ed"}
Apr 16 18:11:04.256872 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.256823 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-9wqsc" podStartSLOduration=1.53724923 podStartE2EDuration="3.256808708s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:01.679719489 +0000 UTC m=+62.268917753" lastFinishedPulling="2026-04-16 18:11:03.399278967 +0000 UTC m=+63.988477231" observedRunningTime="2026-04-16 18:11:04.256618244 +0000 UTC m=+64.845816531" watchObservedRunningTime="2026-04-16 18:11:04.256808708 +0000 UTC m=+64.846006988"
Apr 16 18:11:04.312166 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:04.312109 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bfbc7597-swr47"]
Apr 16 18:11:04.319598 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:04.319555 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9044a0_b743_4093_837d_ed37995d275b.slice/crio-7edb45054febe258eba3604f2488df268b915852c9afe425fe9e69240ee3c46f WatchSource:0}: Error finding container 7edb45054febe258eba3604f2488df268b915852c9afe425fe9e69240ee3c46f: Status 404 returned error can't find the container with id 7edb45054febe258eba3604f2488df268b915852c9afe425fe9e69240ee3c46f
Apr 16 18:11:05.249139 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.249099 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505"}
Apr 16 18:11:05.250440 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.250409 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bfbc7597-swr47" event={"ID":"ee9044a0-b743-4093-837d-ed37995d275b","Type":"ContainerStarted","Data":"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682"}
Apr 16 18:11:05.250523 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.250447 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bfbc7597-swr47" event={"ID":"ee9044a0-b743-4093-837d-ed37995d275b","Type":"ContainerStarted","Data":"7edb45054febe258eba3604f2488df268b915852c9afe425fe9e69240ee3c46f"}
Apr 16 18:11:05.252188 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.252159 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j95hp" event={"ID":"935b67f9-01a4-4c36-99a5-76ff15afe07f","Type":"ContainerStarted","Data":"8541f69e9c40061d510f1b2f2c26e14bf698f1930ec3dfacb9930bd2aca017b9"}
Apr 16 18:11:05.252287 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.252194 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j95hp" event={"ID":"935b67f9-01a4-4c36-99a5-76ff15afe07f","Type":"ContainerStarted","Data":"34556dfb51258aaa6e7fffd1c55f4c9f7ac67c727718566875cb9394c9ac8d93"}
Apr 16 18:11:05.266798 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.266750 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58bfbc7597-swr47" podStartSLOduration=2.266725009 podStartE2EDuration="2.266725009s" podCreationTimestamp="2026-04-16 18:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:05.26667338 +0000 UTC m=+65.855871680" watchObservedRunningTime="2026-04-16 18:11:05.266725009 +0000 UTC m=+65.855923293"
Apr 16 18:11:05.281494 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.281447 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j95hp" podStartSLOduration=2.590672732 podStartE2EDuration="4.281432576s" podCreationTimestamp="2026-04-16 18:11:01 +0000 UTC" firstStartedPulling="2026-04-16 18:11:02.057058995 +0000 UTC m=+62.646257260" lastFinishedPulling="2026-04-16 18:11:03.747818827 +0000 UTC m=+64.337017104" observedRunningTime="2026-04-16 18:11:05.281094846 +0000 UTC m=+65.870293132" watchObservedRunningTime="2026-04-16 18:11:05.281432576 +0000 UTC m=+65.870630868"
Apr 16 18:11:05.764349 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.764255 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:11:05.766358 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.766337 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:11:05.777372 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.777335 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eacb341-6891-41dc-a3c0-09b5697178ee-metrics-certs\") pod \"network-metrics-daemon-chzqx\" (UID: \"4eacb341-6891-41dc-a3c0-09b5697178ee\") " pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:11:05.804160 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.804124 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vqckd\""
Apr 16 18:11:05.812003 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.811981 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-chzqx"
Apr 16 18:11:05.866149 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.866111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:11:05.868299 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.868248 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:11:05.879193 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.879170 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:11:05.890989 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.890969 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpkv7\" (UniqueName: \"kubernetes.io/projected/d252d242-5753-478c-9b07-d4b27eb2d3e8-kube-api-access-qpkv7\") pod \"network-check-target-9rmkx\" (UID: \"d252d242-5753-478c-9b07-d4b27eb2d3e8\") " pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:11:05.934113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:05.934081 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-chzqx"]
Apr 16 18:11:05.938724 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:05.938697 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eacb341_6891_41dc_a3c0_09b5697178ee.slice/crio-f0a1c0f0bd0777488e94c9da949c20430c7cf8809233d93dfa6ce3867bd03838 WatchSource:0}: Error finding container f0a1c0f0bd0777488e94c9da949c20430c7cf8809233d93dfa6ce3867bd03838: Status 404 returned error can't find the container with id f0a1c0f0bd0777488e94c9da949c20430c7cf8809233d93dfa6ce3867bd03838
Apr 16 18:11:06.099305 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.099223 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w4nfv\""
Apr 16 18:11:06.107798 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.107778 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9rmkx"
Apr 16 18:11:06.225830 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.225793 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9rmkx"]
Apr 16 18:11:06.229228 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:11:06.229189 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd252d242_5753_478c_9b07_d4b27eb2d3e8.slice/crio-839a03d7ad7fc697294101b87b601280ed902ec61b3952b36b0f0ce63b6f2f9d WatchSource:0}: Error finding container 839a03d7ad7fc697294101b87b601280ed902ec61b3952b36b0f0ce63b6f2f9d: Status 404 returned error can't find the container with id 839a03d7ad7fc697294101b87b601280ed902ec61b3952b36b0f0ce63b6f2f9d
Apr 16 18:11:06.256234 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.256196 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505" exitCode=0
Apr 16 18:11:06.256669 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.256242 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505"}
Apr 16 18:11:06.257511 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.257488 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-chzqx" event={"ID":"4eacb341-6891-41dc-a3c0-09b5697178ee","Type":"ContainerStarted","Data":"f0a1c0f0bd0777488e94c9da949c20430c7cf8809233d93dfa6ce3867bd03838"}
Apr 16 18:11:06.258492 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:06.258475 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9rmkx" event={"ID":"d252d242-5753-478c-9b07-d4b27eb2d3e8","Type":"ContainerStarted","Data":"839a03d7ad7fc697294101b87b601280ed902ec61b3952b36b0f0ce63b6f2f9d"}
Apr 16 18:11:07.274609 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.271765 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-chzqx" event={"ID":"4eacb341-6891-41dc-a3c0-09b5697178ee","Type":"ContainerStarted","Data":"21863ca079e08d78a8398352dfcd54c85677a3b2754a4784b8e45af6df43dab8"}
Apr 16 18:11:07.448958 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.448115 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:11:07.452431 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.452399 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.456122 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.455918 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 18:11:07.456122 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.455918 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.456493 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.456727 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4ecq72jksclj1\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.456919 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.457111 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.457290 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.457461 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 18:11:07.457635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.457510 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.464567 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.465370 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.465488 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.465665 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.465745 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6c99d\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.466042 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 18:11:07.468911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.466145 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 18:11:07.482659 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482492 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482659 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482546 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482659 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482599 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482657 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482715 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlr6h\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482753 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482818 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482845 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.482887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482881 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.483061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482921 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:07.483061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.482978 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483008 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483034 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483176 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483061 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483176 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483154 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483236 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483180 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483236 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483212 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.483293 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.483235 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.577377 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.577327 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:11:07.577741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.577696 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:11:07.586628 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586482 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 
18:11:07.586839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586819 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.586973 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586858 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.586973 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586887 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.586973 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586919 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlr6h\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.586973 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586946 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" 
(UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.586975 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587000 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587027 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587053 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587087 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587118 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587178 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587233 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587292 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587318 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.587691 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.587365 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.591601 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.588393 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.591601 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.589003 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.591760 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.591690 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.592306 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.591929 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.592306 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.592257 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.593228 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.593203 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:11:07.597340 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.597302 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlr6h\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.597980 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.597961 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.598130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.598110 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.598448 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.598852 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.599192 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599415 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.599347 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599948 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.599813 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.599948 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.599909 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.603126 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.603070 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.603208 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:07.603150 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.603248 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.603222 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.776810 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.776724 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:07.954614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:07.954547 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:11:08.216690 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.215907 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zkgtz" Apr 16 18:11:08.278972 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.278755 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821"} Apr 16 18:11:08.278972 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.278918 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949"} Apr 16 18:11:08.278972 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.278936 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f"} Apr 16 18:11:08.278972 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.278949 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318"} Apr 16 18:11:08.283715 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.282849 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-chzqx" event={"ID":"4eacb341-6891-41dc-a3c0-09b5697178ee","Type":"ContainerStarted","Data":"5143eb4c590e32477bf8fbdf291cba9b64b7b2ca8cbc813689fa1c46650f5455"} Apr 16 18:11:08.285286 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.285200 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244" exitCode=0 Apr 16 18:11:08.285570 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.285450 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} Apr 16 18:11:08.285570 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.285482 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"20a205898920a9478b60b2a0d53956afd19fde374a080ba991b9627052cd77e4"} Apr 16 18:11:08.291718 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.291571 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:11:08.299617 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.299480 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-chzqx" podStartSLOduration=67.279678159 podStartE2EDuration="1m8.29945968s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:11:05.940604792 +0000 UTC m=+66.529803056" lastFinishedPulling="2026-04-16 18:11:06.960386308 +0000 UTC m=+67.549584577" observedRunningTime="2026-04-16 18:11:08.29769826 +0000 UTC m=+68.886896558" watchObservedRunningTime="2026-04-16 18:11:08.29945968 +0000 UTC m=+68.888657968" Apr 16 18:11:08.853405 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:08.853370 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bfbc7597-swr47"] Apr 16 18:11:09.294640 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:09.294562 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7"} Apr 16 18:11:10.299110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:10.299074 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9rmkx" event={"ID":"d252d242-5753-478c-9b07-d4b27eb2d3e8","Type":"ContainerStarted","Data":"a92cc4776cd2499ccc9aa78b2f286d58729d0e5d51f2ba4fe7b33da5a5a51adb"} Apr 16 18:11:10.299614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:10.299162 2583 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:11:10.302415 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:10.302382 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerStarted","Data":"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02"} Apr 16 18:11:10.315412 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:10.315359 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9rmkx" podStartSLOduration=67.127418045 podStartE2EDuration="1m10.315341592s" podCreationTimestamp="2026-04-16 18:10:00 +0000 UTC" firstStartedPulling="2026-04-16 18:11:06.231053702 +0000 UTC m=+66.820251966" lastFinishedPulling="2026-04-16 18:11:09.418977247 +0000 UTC m=+70.008175513" observedRunningTime="2026-04-16 18:11:10.313094395 +0000 UTC m=+70.902292682" watchObservedRunningTime="2026-04-16 18:11:10.315341592 +0000 UTC m=+70.904539879" Apr 16 18:11:10.336277 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:10.336220 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.112866209 podStartE2EDuration="8.336205534s" podCreationTimestamp="2026-04-16 18:11:02 +0000 UTC" firstStartedPulling="2026-04-16 18:11:03.531365382 +0000 UTC m=+64.120563646" lastFinishedPulling="2026-04-16 18:11:09.754704707 +0000 UTC m=+70.343902971" observedRunningTime="2026-04-16 18:11:10.335224297 +0000 UTC m=+70.924422592" watchObservedRunningTime="2026-04-16 18:11:10.336205534 +0000 UTC m=+70.925403823" Apr 16 18:11:12.311636 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:12.311572 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} Apr 16 18:11:12.313812 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:12.311645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} Apr 16 18:11:14.182622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.182560 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58bfbc7597-swr47" Apr 16 18:11:14.321537 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.321497 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} Apr 16 18:11:14.321537 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.321544 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} Apr 16 18:11:14.321807 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.321558 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} Apr 16 18:11:14.321807 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.321569 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerStarted","Data":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} Apr 16 18:11:14.350050 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:14.349941 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.638752634 podStartE2EDuration="7.349923873s" podCreationTimestamp="2026-04-16 18:11:07 +0000 UTC" firstStartedPulling="2026-04-16 18:11:08.287768186 +0000 UTC m=+68.876966456" lastFinishedPulling="2026-04-16 18:11:13.998939416 +0000 UTC m=+74.588137695" observedRunningTime="2026-04-16 18:11:14.34816339 +0000 UTC m=+74.937361675" watchObservedRunningTime="2026-04-16 18:11:14.349923873 +0000 UTC m=+74.939122150" Apr 16 18:11:17.777463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:17.777414 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:11:19.282933 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:19.282897 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:11:33.875626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:33.875554 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58bfbc7597-swr47" podUID="ee9044a0-b743-4093-837d-ed37995d275b" containerName="console" containerID="cri-o://50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682" gracePeriod=15 Apr 16 18:11:34.116986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.116964 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bfbc7597-swr47_ee9044a0-b743-4093-837d-ed37995d275b/console/0.log" Apr 16 18:11:34.117101 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.117029 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58bfbc7597-swr47" Apr 16 18:11:34.220040 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220010 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220048 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220077 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220159 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220193 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkxt8\" (UniqueName: \"kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220356 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:34.220235 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220356 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220269 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle\") pod \"ee9044a0-b743-4093-837d-ed37995d275b\" (UID: \"ee9044a0-b743-4093-837d-ed37995d275b\") " Apr 16 18:11:34.220457 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220436 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:34.220498 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220469 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca" (OuterVolumeSpecName: "service-ca") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:34.220553 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220524 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-oauth-serving-cert\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.220700 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220676 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config" (OuterVolumeSpecName: "console-config") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:34.220812 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.220793 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:34.222796 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.222777 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:34.222989 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.222963 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:34.223089 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.222985 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8" (OuterVolumeSpecName: "kube-api-access-jkxt8") pod "ee9044a0-b743-4093-837d-ed37995d275b" (UID: "ee9044a0-b743-4093-837d-ed37995d275b"). InnerVolumeSpecName "kube-api-access-jkxt8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:34.321288 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.321248 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-console-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.321288 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.321279 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkxt8\" (UniqueName: \"kubernetes.io/projected/ee9044a0-b743-4093-837d-ed37995d275b-kube-api-access-jkxt8\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.321288 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.321289 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-oauth-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.321288 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:34.321298 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-trusted-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.321542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.321307 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee9044a0-b743-4093-837d-ed37995d275b-service-ca\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.321542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.321316 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee9044a0-b743-4093-837d-ed37995d275b-console-serving-cert\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:34.381591 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381562 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bfbc7597-swr47_ee9044a0-b743-4093-837d-ed37995d275b/console/0.log" Apr 16 18:11:34.381736 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381626 2583 generic.go:358] "Generic (PLEG): container finished" podID="ee9044a0-b743-4093-837d-ed37995d275b" containerID="50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682" exitCode=2 Apr 16 18:11:34.381736 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381697 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58bfbc7597-swr47" Apr 16 18:11:34.381812 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381693 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bfbc7597-swr47" event={"ID":"ee9044a0-b743-4093-837d-ed37995d275b","Type":"ContainerDied","Data":"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682"} Apr 16 18:11:34.381812 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381798 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bfbc7597-swr47" event={"ID":"ee9044a0-b743-4093-837d-ed37995d275b","Type":"ContainerDied","Data":"7edb45054febe258eba3604f2488df268b915852c9afe425fe9e69240ee3c46f"} Apr 16 18:11:34.381877 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.381813 2583 scope.go:117] "RemoveContainer" containerID="50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682" Apr 16 18:11:34.394533 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.394515 2583 scope.go:117] "RemoveContainer" containerID="50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682" Apr 16 18:11:34.394859 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:34.394835 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682\": container with ID starting with 50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682 not found: ID does not exist" containerID="50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682" Apr 16 18:11:34.394953 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.394884 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682"} err="failed to get container status \"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682\": rpc error: code = 
NotFound desc = could not find container \"50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682\": container with ID starting with 50b125db26972c8956b15ca013b41b48a9b1da20b5ebfda0d2ed6706768b5682 not found: ID does not exist" Apr 16 18:11:34.403019 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.402998 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bfbc7597-swr47"] Apr 16 18:11:34.406562 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:34.406541 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58bfbc7597-swr47"] Apr 16 18:11:35.991934 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:35.991904 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9044a0-b743-4093-837d-ed37995d275b" path="/var/lib/kubelet/pods/ee9044a0-b743-4093-837d-ed37995d275b/volumes" Apr 16 18:11:41.308067 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:41.308036 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9rmkx" Apr 16 18:11:44.303437 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.303379 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f576946cd-gdtrr" podUID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" containerName="console" containerID="cri-o://47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8" gracePeriod=15 Apr 16 18:11:44.542977 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.542952 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f576946cd-gdtrr_9d3db23b-0ff6-4a77-b594-d78e3a392d67/console/0.log" Apr 16 18:11:44.543095 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.543010 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:11:44.601928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.601841 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.601928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.601898 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.601928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.601929 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.602188 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.601953 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.602188 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.601971 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.602188 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:44.602001 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rxp\" (UniqueName: \"kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp\") pod \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\" (UID: \"9d3db23b-0ff6-4a77-b594-d78e3a392d67\") " Apr 16 18:11:44.602409 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.602378 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:44.602483 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.602404 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:44.602483 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.602465 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config" (OuterVolumeSpecName: "console-config") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:44.604309 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.604283 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:44.604413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.604317 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:44.604413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.604356 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp" (OuterVolumeSpecName: "kube-api-access-94rxp") pod "9d3db23b-0ff6-4a77-b594-d78e3a392d67" (UID: "9d3db23b-0ff6-4a77-b594-d78e3a392d67"). InnerVolumeSpecName "kube-api-access-94rxp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:44.703070 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703012 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-oauth-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:44.703070 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703063 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-service-ca\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:44.703070 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703073 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:44.703070 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703081 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d3db23b-0ff6-4a77-b594-d78e3a392d67-console-serving-cert\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:44.703070 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703091 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94rxp\" (UniqueName: \"kubernetes.io/projected/9d3db23b-0ff6-4a77-b594-d78e3a392d67-kube-api-access-94rxp\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:44.703371 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:44.703102 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d3db23b-0ff6-4a77-b594-d78e3a392d67-oauth-serving-cert\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:11:45.417971 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:11:45.417944 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f576946cd-gdtrr_9d3db23b-0ff6-4a77-b594-d78e3a392d67/console/0.log" Apr 16 18:11:45.418326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.417981 2583 generic.go:358] "Generic (PLEG): container finished" podID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" containerID="47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8" exitCode=2 Apr 16 18:11:45.418326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.418034 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f576946cd-gdtrr" event={"ID":"9d3db23b-0ff6-4a77-b594-d78e3a392d67","Type":"ContainerDied","Data":"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8"} Apr 16 18:11:45.418326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.418061 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f576946cd-gdtrr" event={"ID":"9d3db23b-0ff6-4a77-b594-d78e3a392d67","Type":"ContainerDied","Data":"a19b4b495b84b437226fb58e9e9535d9a50f02644bf3b0896099fcec08db2649"} Apr 16 18:11:45.418326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.418071 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f576946cd-gdtrr" Apr 16 18:11:45.418326 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.418078 2583 scope.go:117] "RemoveContainer" containerID="47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8" Apr 16 18:11:45.426625 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.426609 2583 scope.go:117] "RemoveContainer" containerID="47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8" Apr 16 18:11:45.426922 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:11:45.426893 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8\": container with ID starting with 47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8 not found: ID does not exist" containerID="47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8" Apr 16 18:11:45.427005 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.426919 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8"} err="failed to get container status \"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8\": rpc error: code = NotFound desc = could not find container \"47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8\": container with ID starting with 47826becc281edeed10688c2a92f26d76a38200fb8d70674a3021d149a3a1cf8 not found: ID does not exist" Apr 16 18:11:45.437361 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.437329 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:11:45.442998 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:11:45.442975 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f576946cd-gdtrr"] Apr 16 18:11:45.991506 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:11:45.991472 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" path="/var/lib/kubelet/pods/9d3db23b-0ff6-4a77-b594-d78e3a392d67/volumes" Apr 16 18:12:07.777520 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:07.777476 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:07.797439 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:07.797415 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:08.498208 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:08.498180 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:21.484993 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.484959 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:21.485504 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485438 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="alertmanager" containerID="cri-o://f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" gracePeriod=120 Apr 16 18:12:21.485686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485539 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="config-reloader" containerID="cri-o://5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" gracePeriod=120 Apr 16 18:12:21.485686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485530 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" 
containerName="kube-rbac-proxy-web" containerID="cri-o://d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" gracePeriod=120 Apr 16 18:12:21.485686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485604 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="prom-label-proxy" containerID="cri-o://0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" gracePeriod=120 Apr 16 18:12:21.485686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485551 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-metric" containerID="cri-o://b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" gracePeriod=120 Apr 16 18:12:21.485686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:21.485632 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy" containerID="cri-o://966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" gracePeriod=120 Apr 16 18:12:22.521113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521081 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" exitCode=0 Apr 16 18:12:22.521113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521104 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" exitCode=0 Apr 16 18:12:22.521113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521111 2583 generic.go:358] "Generic (PLEG): container finished" 
podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" exitCode=0 Apr 16 18:12:22.521113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521116 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" exitCode=0 Apr 16 18:12:22.521574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521154 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02"} Apr 16 18:12:22.521574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521187 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821"} Apr 16 18:12:22.521574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521197 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f"} Apr 16 18:12:22.521574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.521207 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318"} Apr 16 18:12:22.734158 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.734134 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:22.782382 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782288 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782382 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782342 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782382 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782382 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782416 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782459 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tms4z\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: 
\"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782492 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782526 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782552 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782599 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782644 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.782689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782671 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.783124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782701 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.783124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782743 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy\") pod \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\" (UID: \"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70\") " Apr 16 18:12:22.783124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782785 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:22.783124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.782982 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.784264 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.784196 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:22.784458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.784432 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:22.788356 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.788103 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.788556 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.788529 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.788732 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.788559 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out" (OuterVolumeSpecName: "config-out") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:22.788835 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.788479 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z" (OuterVolumeSpecName: "kube-api-access-tms4z") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "kube-api-access-tms4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:22.789084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.789048 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.789226 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.789190 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.789794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.789755 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.790965 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.790936 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:22.793056 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.793010 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.799089 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.799062 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config" (OuterVolumeSpecName: "web-config") pod "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" (UID: "7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:22.883891 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883845 2583 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-cluster-tls-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.883891 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883888 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-web-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883902 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-alertmanager-main-db\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883915 2583 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-volume\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883927 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-main-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883940 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883954 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883966 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883978 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-config-out\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.883989 2583 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-metrics-client-ca\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.884001 2583 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-tms4z\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-kube-api-access-tms4z\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:22.884117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:22.884012 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70-tls-assets\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:23.526794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526761 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" exitCode=0 Apr 16 18:12:23.526794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526785 2583 generic.go:358] "Generic (PLEG): container finished" podID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerID="d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" exitCode=0 Apr 16 18:12:23.527285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526813 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7"} Apr 16 18:12:23.527285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526840 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949"} Apr 16 18:12:23.527285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526850 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70","Type":"ContainerDied","Data":"e52660f49111f26c4fdd93ff38ea20485bbde8cd921e9fde29ab98122128f470"} Apr 16 18:12:23.527285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526865 2583 scope.go:117] "RemoveContainer" containerID="0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" Apr 16 18:12:23.527285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.526889 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:12:23.534124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.534107 2583 scope.go:117] "RemoveContainer" containerID="b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" Apr 16 18:12:23.543794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.543780 2583 scope.go:117] "RemoveContainer" containerID="966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" Apr 16 18:12:23.550126 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.550111 2583 scope.go:117] "RemoveContainer" containerID="d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" Apr 16 18:12:23.555640 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.555620 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:23.556812 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.556662 2583 scope.go:117] "RemoveContainer" containerID="5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" Apr 16 18:12:23.560727 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.560703 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:23.564054 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.564021 2583 scope.go:117] "RemoveContainer" containerID="f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" Apr 16 18:12:23.570179 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.570162 2583 
scope.go:117] "RemoveContainer" containerID="4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505" Apr 16 18:12:23.576109 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576079 2583 scope.go:117] "RemoveContainer" containerID="0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" Apr 16 18:12:23.576331 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.576312 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02\": container with ID starting with 0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02 not found: ID does not exist" containerID="0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" Apr 16 18:12:23.576381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576340 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02"} err="failed to get container status \"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02\": rpc error: code = NotFound desc = could not find container \"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02\": container with ID starting with 0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02 not found: ID does not exist" Apr 16 18:12:23.576381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576370 2583 scope.go:117] "RemoveContainer" containerID="b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" Apr 16 18:12:23.576643 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.576621 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7\": container with ID starting with b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7 not found: ID does not 
exist" containerID="b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" Apr 16 18:12:23.576701 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576654 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7"} err="failed to get container status \"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7\": rpc error: code = NotFound desc = could not find container \"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7\": container with ID starting with b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7 not found: ID does not exist" Apr 16 18:12:23.576701 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576677 2583 scope.go:117] "RemoveContainer" containerID="966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" Apr 16 18:12:23.576918 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.576903 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821\": container with ID starting with 966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821 not found: ID does not exist" containerID="966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" Apr 16 18:12:23.576958 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576922 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821"} err="failed to get container status \"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821\": rpc error: code = NotFound desc = could not find container \"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821\": container with ID starting with 966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821 not found: ID does not exist" Apr 16 
18:12:23.576958 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.576936 2583 scope.go:117] "RemoveContainer" containerID="d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" Apr 16 18:12:23.577147 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.577131 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949\": container with ID starting with d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949 not found: ID does not exist" containerID="d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" Apr 16 18:12:23.577187 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577152 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949"} err="failed to get container status \"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949\": rpc error: code = NotFound desc = could not find container \"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949\": container with ID starting with d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949 not found: ID does not exist" Apr 16 18:12:23.577187 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577166 2583 scope.go:117] "RemoveContainer" containerID="5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" Apr 16 18:12:23.577367 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.577352 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f\": container with ID starting with 5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f not found: ID does not exist" containerID="5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" Apr 16 18:12:23.577408 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577370 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f"} err="failed to get container status \"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f\": rpc error: code = NotFound desc = could not find container \"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f\": container with ID starting with 5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f not found: ID does not exist" Apr 16 18:12:23.577408 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577384 2583 scope.go:117] "RemoveContainer" containerID="f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" Apr 16 18:12:23.577698 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.577677 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318\": container with ID starting with f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318 not found: ID does not exist" containerID="f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" Apr 16 18:12:23.577768 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577697 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318"} err="failed to get container status \"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318\": rpc error: code = NotFound desc = could not find container \"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318\": container with ID starting with f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318 not found: ID does not exist" Apr 16 18:12:23.577768 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577712 2583 scope.go:117] "RemoveContainer" 
containerID="4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505" Apr 16 18:12:23.577953 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:23.577937 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505\": container with ID starting with 4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505 not found: ID does not exist" containerID="4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505" Apr 16 18:12:23.577993 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577958 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505"} err="failed to get container status \"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505\": rpc error: code = NotFound desc = could not find container \"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505\": container with ID starting with 4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505 not found: ID does not exist" Apr 16 18:12:23.577993 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.577972 2583 scope.go:117] "RemoveContainer" containerID="0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02" Apr 16 18:12:23.578153 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578137 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02"} err="failed to get container status \"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02\": rpc error: code = NotFound desc = could not find container \"0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02\": container with ID starting with 0b0ed0d15eb053e2e13ecf560b909d53a68072bcb451d3ffba9de023c2a30e02 not found: ID does not exist" Apr 16 
18:12:23.578191 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578153 2583 scope.go:117] "RemoveContainer" containerID="b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7" Apr 16 18:12:23.578360 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578345 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7"} err="failed to get container status \"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7\": rpc error: code = NotFound desc = could not find container \"b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7\": container with ID starting with b4efb2c1fcd5f73ee88a30c7fee1fa9c6e674e491247459493617b5df12424e7 not found: ID does not exist" Apr 16 18:12:23.578446 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578360 2583 scope.go:117] "RemoveContainer" containerID="966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821" Apr 16 18:12:23.578547 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578529 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821"} err="failed to get container status \"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821\": rpc error: code = NotFound desc = could not find container \"966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821\": container with ID starting with 966484c9152b864c2beea28ca72a75588d54ed0c56815dabded9a9bef18b3821 not found: ID does not exist" Apr 16 18:12:23.578612 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578547 2583 scope.go:117] "RemoveContainer" containerID="d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949" Apr 16 18:12:23.578772 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578755 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949"} err="failed to get container status \"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949\": rpc error: code = NotFound desc = could not find container \"d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949\": container with ID starting with d989b63100616a93673b485ef3ecfb76a594938bc3baff9be62df49b66a96949 not found: ID does not exist" Apr 16 18:12:23.578809 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578772 2583 scope.go:117] "RemoveContainer" containerID="5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f" Apr 16 18:12:23.578954 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578939 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f"} err="failed to get container status \"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f\": rpc error: code = NotFound desc = could not find container \"5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f\": container with ID starting with 5a38b44fc8a4b1b542cd20b1ad068b6def16048b515e21dcd94ca41b58fedd6f not found: ID does not exist" Apr 16 18:12:23.578996 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.578955 2583 scope.go:117] "RemoveContainer" containerID="f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318" Apr 16 18:12:23.579141 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.579128 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318"} err="failed to get container status \"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318\": rpc error: code = NotFound desc = could not find container \"f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318\": container with ID starting with 
f2111f496f98448ae97677992e5fb6132e6afbb6b126710686f54c8a7a670318 not found: ID does not exist" Apr 16 18:12:23.579177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.579141 2583 scope.go:117] "RemoveContainer" containerID="4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505" Apr 16 18:12:23.579315 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.579300 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505"} err="failed to get container status \"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505\": rpc error: code = NotFound desc = could not find container \"4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505\": container with ID starting with 4c19963b3f288f410461ae86f5ebbfd1b9132a17895e3e16f6fd6b005d9f0505 not found: ID does not exist" Apr 16 18:12:23.589942 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.589920 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:12:23.590214 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590180 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee9044a0-b743-4093-837d-ed37995d275b" containerName="console" Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590216 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9044a0-b743-4093-837d-ed37995d275b" containerName="console" Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590227 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="init-config-reloader" Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590233 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="init-config-reloader" Apr 16 18:12:23.590256 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590240 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="prom-label-proxy"
Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590245 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="prom-label-proxy"
Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590252 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" containerName="console"
Apr 16 18:12:23.590256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590257 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" containerName="console"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590262 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="config-reloader"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590269 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="config-reloader"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590276 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590281 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590294 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-web"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590299 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-web"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590306 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-metric"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590311 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-metric"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590319 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="alertmanager"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590323 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="alertmanager"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590363 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="prom-label-proxy"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590370 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="config-reloader"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590376 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590383 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee9044a0-b743-4093-837d-ed37995d275b" containerName="console"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590389 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="alertmanager"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590394 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-web"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590401 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d3db23b-0ff6-4a77-b594-d78e3a392d67" containerName="console"
Apr 16 18:12:23.590447 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.590407 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" containerName="kube-rbac-proxy-metric"
Apr 16 18:12:23.596759 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.596741 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.598981 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.598942 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:12:23.599099 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.598989 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:12:23.599099 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599013 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:12:23.599099 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599030 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:12:23.599328 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599309 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hnrnw\""
Apr 16 18:12:23.599424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599315 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:12:23.599424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599404 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:12:23.599625 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599609 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:12:23.599711 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.599657 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:12:23.604781 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.604736 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:12:23.607168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.607149 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:12:23.690770 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690735 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-config-volume\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.690770 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690783 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690844 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690920 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-web-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691014 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.690946 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691023 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691052 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-config-out\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691073 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691099 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691124 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ds4v\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-kube-api-access-9ds4v\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691164 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.691353 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.691202 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.791986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.791905 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.791986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.791944 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.791986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.791965 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ds4v\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-kube-api-access-9ds4v\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.791986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.791983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792032 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-config-volume\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792095 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792114 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792139 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792168 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-web-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792203 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792268 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.792684 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-config-out\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.793219 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.792747 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.793753 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.793398 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795053 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795009 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795221 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795179 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795539 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795519 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2c160745-48d9-4d31-823f-b47ca84ded99-config-out\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795667 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795926 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795871 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-web-config\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.795991 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.795922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.796201 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.796178 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-config-volume\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.796307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.796284 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.796885 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.796858 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.797572 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.797553 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2c160745-48d9-4d31-823f-b47ca84ded99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.800152 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.800134 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ds4v\" (UniqueName: \"kubernetes.io/projected/2c160745-48d9-4d31-823f-b47ca84ded99-kube-api-access-9ds4v\") pod \"alertmanager-main-0\" (UID: \"2c160745-48d9-4d31-823f-b47ca84ded99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.907600 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.907551 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:12:23.993632 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:23.993604 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70" path="/var/lib/kubelet/pods/7b14e9cb-a8aa-41ea-8a56-efa56a9eaa70/volumes"
Apr 16 18:12:24.046808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:24.046598 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:12:24.049339 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:12:24.049312 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c160745_48d9_4d31_823f_b47ca84ded99.slice/crio-3eba8c6fcb76418caddb7f3da682dec3c123275b22e1400b5333055bdd55b083 WatchSource:0}: Error finding container 3eba8c6fcb76418caddb7f3da682dec3c123275b22e1400b5333055bdd55b083: Status 404 returned error can't find the container with id 3eba8c6fcb76418caddb7f3da682dec3c123275b22e1400b5333055bdd55b083
Apr 16 18:12:24.531135 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:24.531105 2583 generic.go:358] "Generic (PLEG): container finished" podID="2c160745-48d9-4d31-823f-b47ca84ded99" containerID="16a2a07e08666f187ed16b83f3b619ee1c3377828e022fbd519150f1ae8eda7b" exitCode=0
Apr 16 18:12:24.531539 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:24.531193 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerDied","Data":"16a2a07e08666f187ed16b83f3b619ee1c3377828e022fbd519150f1ae8eda7b"}
Apr 16 18:12:24.531539 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:24.531235 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"3eba8c6fcb76418caddb7f3da682dec3c123275b22e1400b5333055bdd55b083"}
Apr 16 18:12:25.490168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.490113 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"]
Apr 16 18:12:25.493506 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.493479 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.496130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.496105 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:12:25.496243 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.496130 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:12:25.496243 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.496211 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:12:25.498337 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.498316 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:12:25.498462 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.498380 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:12:25.498462 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.498439 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-b76xq\""
Apr 16 18:12:25.505005 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.504157 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:12:25.506817 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.506793 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"]
Apr 16 18:12:25.540106 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540070 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"59067898dbd98a8859f1e3af55dcc35715de89f0d991824dc7d4275f58509c32"}
Apr 16 18:12:25.540542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540113 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"3cac6eb716c02170a52aa5746a0213c3a7fe1af59c48482ec6975c3f9fc21b58"}
Apr 16 18:12:25.540542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540129 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"dc17d77adeac1aadd56c0e40aa8f87feebd200db3e692fc0d9c9a28f15a5953c"}
Apr 16 18:12:25.540542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540141 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"25506f825abcf281623d28b8b55afe3343b88b57750132adb5bf6c79d46c6db9"}
Apr 16 18:12:25.540542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540154 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"238d02626e5ffc5de57dbcc86fef9e6e06f6a50c9dec6701849fcdac87d6a301"}
Apr 16 18:12:25.540542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.540166 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2c160745-48d9-4d31-823f-b47ca84ded99","Type":"ContainerStarted","Data":"39d9c72ea727926aa094e324234f7fdbf936b0a5f6b2a16092011e25cdbe81f4"}
Apr 16 18:12:25.565098 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.565041 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.565023407 podStartE2EDuration="2.565023407s" podCreationTimestamp="2026-04-16 18:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:25.564699677 +0000 UTC m=+146.153897977" watchObservedRunningTime="2026-04-16 18:12:25.565023407 +0000 UTC m=+146.154221693"
Apr 16 18:12:25.605465 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605418 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-federate-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605671 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605490 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605671 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605609 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605782 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605782 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605736 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-kube-api-access-ztmb9\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605782 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605775 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-serving-certs-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605922 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605819 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-metrics-client-ca\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.605922 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.605894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707017 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.706984 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707017 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707028 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-federate-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707117 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707159 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707184 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-kube-api-access-ztmb9\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707236 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-serving-certs-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.707285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.707274 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-metrics-client-ca\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.708213 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.708156 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-metrics-client-ca\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.708472 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.708441 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.709013 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.708986 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-serving-certs-ca-bundle\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"
Apr 16 18:12:25.710053 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.709990 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID:
\"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.710053 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.709993 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-telemeter-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.710240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.710220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-secret-telemeter-client\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.711610 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.710336 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-federate-client-tls\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.717180 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.717156 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmb9\" (UniqueName: \"kubernetes.io/projected/dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c-kube-api-access-ztmb9\") pod \"telemeter-client-6b488c7c54-zh9bb\" (UID: \"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c\") " pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.797495 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797391 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:25.797952 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797908 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="prometheus" containerID="cri-o://7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007" gracePeriod=600 Apr 16 18:12:25.797952 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797941 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="thanos-sidecar" containerID="cri-o://fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d" gracePeriod=600 Apr 16 18:12:25.798148 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797960 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-web" containerID="cri-o://2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804" gracePeriod=600 Apr 16 18:12:25.798148 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797998 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="config-reloader" containerID="cri-o://10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32" gracePeriod=600 Apr 16 18:12:25.798148 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.797936 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy" containerID="cri-o://2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771" gracePeriod=600 Apr 16 18:12:25.798148 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:12:25.797967 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac" gracePeriod=600 Apr 16 18:12:25.805374 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.804833 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" Apr 16 18:12:25.969302 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:25.969232 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b488c7c54-zh9bb"] Apr 16 18:12:25.973969 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:12:25.973932 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc3eddbc_d2f4_4ce2_8cd0_8e12366a208c.slice/crio-2c845ea5cb552420fe98dcd6991a3f3ed7e3f89b75c6df276f4b753f85a8c612 WatchSource:0}: Error finding container 2c845ea5cb552420fe98dcd6991a3f3ed7e3f89b75c6df276f4b753f85a8c612: Status 404 returned error can't find the container with id 2c845ea5cb552420fe98dcd6991a3f3ed7e3f89b75c6df276f4b753f85a8c612 Apr 16 18:12:26.067421 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.067398 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.110436 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110406 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110453 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110473 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110614 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110501 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110623 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: 
\"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110668 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110702 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110745 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110779 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.110816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110806 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110837 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110864 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110871 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110891 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlr6h\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110932 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110957 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.110996 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.111025 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: 
\"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.111071 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111120 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.111101 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file\") pod \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\" (UID: \"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc\") " Apr 16 18:12:26.111626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.111332 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.111626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.111352 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.113957 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.112897 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:26.113957 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.113231 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.114163 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.113959 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.114163 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.114004 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out" (OuterVolumeSpecName: "config-out") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:26.114651 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.114366 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:26.114651 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.114447 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:12:26.116061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.115831 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:26.117840 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.117773 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.119175 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.119138 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h" (OuterVolumeSpecName: "kube-api-access-vlr6h") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "kube-api-access-vlr6h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:26.119621 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.119547 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:26.119884 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.119846 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.119884 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.119867 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config" (OuterVolumeSpecName: "config") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.120130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.120081 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.120225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.120188 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.121010 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.120975 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.131022 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.130993 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config" (OuterVolumeSpecName: "web-config") pod "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" (UID: "96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:26.212210 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212182 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vlr6h\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-kube-api-access-vlr6h\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212210 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212209 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212219 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212229 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212238 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-metrics-client-ca\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212248 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-kube-rbac-proxy\") on node 
\"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212257 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212267 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212276 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-web-config\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212284 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-prometheus-k8s-db\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212293 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212303 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-prometheus-k8s-tls\") on 
node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212313 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-tls-assets\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212323 2583 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-metrics-client-certs\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212330 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-config-out\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:12:26.212395 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.212339 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc-secret-grpc-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:12:26.546790 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546759 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac" exitCode=0
Apr 16 18:12:26.546790 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546788 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771" exitCode=0
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546796 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804" exitCode=0
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546806 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d" exitCode=0
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546813 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32" exitCode=0
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546820 2583 generic.go:358] "Generic (PLEG): container finished" podID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007" exitCode=0
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546883 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.546881 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"}
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.547009 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"}
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.547029 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"}
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.547043 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"}
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.547058 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"}
Apr 16 18:12:26.547240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.547079 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.548725 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.548693 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"}
Apr 16 18:12:26.548853 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.548747 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc","Type":"ContainerDied","Data":"20a205898920a9478b60b2a0d53956afd19fde374a080ba991b9627052cd77e4"}
Apr 16 18:12:26.556147 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.556116 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" event={"ID":"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c","Type":"ContainerStarted","Data":"2c845ea5cb552420fe98dcd6991a3f3ed7e3f89b75c6df276f4b753f85a8c612"}
Apr 16 18:12:26.563027 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.562989 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.571413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.571394 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.579078 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.579018 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.582270 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.582228 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:12:26.587471 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.587445 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.587852 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.587830 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:12:26.595086 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.595063 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.602407 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.602386 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.610205 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.610187 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.610508 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.610458 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.610622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.610497 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.610622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.610524 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.610973 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.610946 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.611048 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.610982 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist"
Apr 16 18:12:26.611048 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.611006 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.611338 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.611320 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.611416 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.611342 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist"
Apr 16 18:12:26.611416 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.611364 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.611618 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.611598 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.611741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.611714 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist"
Apr 16 18:12:26.611741 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.611742 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.612000 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.611981 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.612066 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612006 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist"
Apr 16 18:12:26.612066 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612026 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.612295 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.612277 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.612377 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612298 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist"
Apr 16 18:12:26.612377 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612311 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.612641 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:12:26.612616 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.612743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612715 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist"
Apr 16 18:12:26.612743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.612739 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.613087 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613061 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.613177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613090 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.613359 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613335 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist"
Apr 16 18:12:26.613424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613360 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.613604 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613561 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist"
Apr 16 18:12:26.613702 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613605 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.613937 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613808 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist"
Apr 16 18:12:26.613937 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.613832 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.614284 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614111 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist"
Apr 16 18:12:26.614284 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614135 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.614510 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614491 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist"
Apr 16 18:12:26.614615 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614512 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.614791 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614761 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist"
Apr 16 18:12:26.614791 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.614789 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.615033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615001 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.615110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615036 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.615274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615244 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist"
Apr 16 18:12:26.615318 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615277 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.615506 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615484 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist"
Apr 16 18:12:26.615598 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615508 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.615811 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615785 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist"
Apr 16 18:12:26.615811 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.615811 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.616311 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616052 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist"
Apr 16 18:12:26.616311 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616085 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.616653 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616632 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist"
Apr 16 18:12:26.616653 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616652 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.616985 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616959 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist"
Apr 16 18:12:26.617081 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.616985 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.617252 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617223 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.617319 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617254 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.617534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617506 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist"
Apr 16 18:12:26.617534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617537 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.617835 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617815 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist"
Apr 16 18:12:26.617885 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.617837 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.618074 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618042 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist"
Apr 16 18:12:26.618074 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618065 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.618297 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618279 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist"
Apr 16 18:12:26.618355 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618299 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.618541 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618518 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist"
Apr 16 18:12:26.618630 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618544 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.618852 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618821 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist"
Apr 16 18:12:26.618938 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.618854 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.619161 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619140 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.619224 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619162 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.619404 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619383 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist"
Apr 16 18:12:26.619458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619407 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"
Apr 16 18:12:26.619664 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619644 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist"
Apr 16 18:12:26.619728 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619665 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"
Apr 16 18:12:26.619921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619898 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist"
Apr 16 18:12:26.619973 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.619924 2583 scope.go:117] "RemoveContainer" containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"
Apr 16 18:12:26.620186 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620166 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist"
Apr 16 18:12:26.620186 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620185 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"
Apr 16 18:12:26.620404 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620386 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist"
Apr 16 18:12:26.620404 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620404 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"
Apr 16 18:12:26.620639 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620616 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist"
Apr 16 18:12:26.620718 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620640 2583 scope.go:117] "RemoveContainer" containerID="24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"
Apr 16 18:12:26.620835 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620807 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac"} err="failed to get container status \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": rpc error: code = NotFound desc = could not find container \"24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac\": container with ID starting with 24b16d18fea78bcadffd04c8f5f23d5983fb4be76832d0c5895c750f144b9cac not found: ID does not exist"
Apr 16 18:12:26.620835 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.620836 2583 scope.go:117] "RemoveContainer" containerID="2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"
Apr 16 18:12:26.621083 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621066 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771"} err="failed to get container status \"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": rpc error: code = NotFound desc = could not find container
\"2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771\": container with ID starting with 2019c183c17c0a42f825723c44a2ce90ad090ddea3a8b8b35db741469baf8771 not found: ID does not exist" Apr 16 18:12:26.621160 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621085 2583 scope.go:117] "RemoveContainer" containerID="2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804" Apr 16 18:12:26.621314 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621295 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804"} err="failed to get container status \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": rpc error: code = NotFound desc = could not find container \"2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804\": container with ID starting with 2b2849c40b254e9a1368bdf94b6d6cc8ad1b7153af10aa5354949478db195804 not found: ID does not exist" Apr 16 18:12:26.621362 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621316 2583 scope.go:117] "RemoveContainer" containerID="fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d" Apr 16 18:12:26.621591 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621548 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d"} err="failed to get container status \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": rpc error: code = NotFound desc = could not find container \"fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d\": container with ID starting with fba88812042160fbec95ca5f66b2c23cd4c51d7e969143e84da6f41feed4084d not found: ID does not exist" Apr 16 18:12:26.621673 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621603 2583 scope.go:117] "RemoveContainer" 
containerID="10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32" Apr 16 18:12:26.621872 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621830 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32"} err="failed to get container status \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": rpc error: code = NotFound desc = could not find container \"10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32\": container with ID starting with 10e1d9ea762288fe46de882e601461b7bb24f04c00a3a02adda1f8af25908c32 not found: ID does not exist" Apr 16 18:12:26.621872 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.621862 2583 scope.go:117] "RemoveContainer" containerID="7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007" Apr 16 18:12:26.622075 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.622055 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007"} err="failed to get container status \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": rpc error: code = NotFound desc = could not find container \"7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007\": container with ID starting with 7bb9e0f6a176b46708aa0cd72597ba33282f5daf691f47c90b7199ce853b0007 not found: ID does not exist" Apr 16 18:12:26.622142 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.622078 2583 scope.go:117] "RemoveContainer" containerID="f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244" Apr 16 18:12:26.622379 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.622353 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244"} err="failed to get container status 
\"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": rpc error: code = NotFound desc = could not find container \"f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244\": container with ID starting with f3a1d3b46ba415afeb468f23d8ef982150aabb944b957b7abbce1582fffce244 not found: ID does not exist" Apr 16 18:12:26.630808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.630787 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:26.631130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631111 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="thanos-sidecar" Apr 16 18:12:26.631130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631133 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="thanos-sidecar" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631150 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631160 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631171 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="config-reloader" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631180 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="config-reloader" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631197 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-thanos" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631205 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-thanos" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631215 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="init-config-reloader" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631223 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="init-config-reloader" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631231 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-web" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631239 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-web" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631250 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="prometheus" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631257 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="prometheus" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631302 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="prometheus" Apr 16 18:12:26.631307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631312 2583 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="config-reloader" Apr 16 18:12:26.631974 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631322 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy" Apr 16 18:12:26.631974 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631332 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="thanos-sidecar" Apr 16 18:12:26.631974 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631342 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-thanos" Apr 16 18:12:26.631974 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.631352 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" containerName="kube-rbac-proxy-web" Apr 16 18:12:26.636720 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.636690 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.645917 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.645900 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:12:26.646193 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.646168 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:12:26.646462 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.646421 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6c99d\"" Apr 16 18:12:26.646700 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.646682 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:12:26.646747 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.646708 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:12:26.647022 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.646997 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:12:26.647098 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647033 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:12:26.647293 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647278 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:12:26.647361 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647292 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:12:26.647361 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647334 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:12:26.647744 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647720 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:12:26.647926 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.647909 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4ecq72jksclj1\"" Apr 16 18:12:26.648829 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.648792 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:12:26.648961 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.648943 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:12:26.652469 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.652446 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:12:26.659780 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.659744 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:26.716903 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.716864 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.716903 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.716910 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.716939 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.716985 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717028 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717063 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717138 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717100 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717157 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717193 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-web-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717229 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717291 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717340 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717400 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717438 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfpl\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-kube-api-access-rrfpl\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717565 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.717687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.717616 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config-out\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818548 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818464 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818548 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818504 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818548 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818525 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818702 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818754 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818791 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.818839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818822 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818855 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818885 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818915 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-web-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818939 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818943 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.818974 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819097 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfpl\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-kube-api-access-rrfpl\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819193 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819215 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config-out\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819621 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819335 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.819869 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.819849 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.821481 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.821088 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.822339 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.822217 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.822439 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.822412 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.823124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.822660 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.823124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.822689 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-web-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.823124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.822984 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.823124 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.823012 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config-out\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.824005 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.823950 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.824112 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.824013 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-config\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.824112 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.824079 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.825312 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.825288 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.825492 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.825468 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.825641 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.825491 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.826282 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.826255 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.831870 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.831844 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfpl\" (UniqueName: \"kubernetes.io/projected/a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46-kube-api-access-rrfpl\") pod \"prometheus-k8s-0\" (UID: \"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:26.948209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:26.948167 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:12:27.091976 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:27.091892 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:12:27.096206 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:12:27.096174 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda16dd21a_5ad7_40e1_8bbb_fa745f3f4c46.slice/crio-c31b7721b0171d366c233cba1933972070130eee4e5da52a33a7593c6dd0e4f5 WatchSource:0}: Error finding container c31b7721b0171d366c233cba1933972070130eee4e5da52a33a7593c6dd0e4f5: Status 404 returned error can't find the container with id c31b7721b0171d366c233cba1933972070130eee4e5da52a33a7593c6dd0e4f5 Apr 16 18:12:27.561743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:27.561558 2583 generic.go:358] "Generic (PLEG): container finished" podID="a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46" containerID="623424fa215a6eb69b20ee53a7a5d6b78a95dd4e4e6258e3f3e7f1ab80d1f288" exitCode=0 Apr 16 18:12:27.561743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:27.561616 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerDied","Data":"623424fa215a6eb69b20ee53a7a5d6b78a95dd4e4e6258e3f3e7f1ab80d1f288"} Apr 16 18:12:27.561743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:27.561657 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"c31b7721b0171d366c233cba1933972070130eee4e5da52a33a7593c6dd0e4f5"} Apr 16 18:12:27.992830 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:27.992798 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc" path="/var/lib/kubelet/pods/96c7ac7a-2ba8-4e49-8ff0-bc8f99c761bc/volumes" Apr 16 18:12:28.568634 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.568573 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" event={"ID":"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c","Type":"ContainerStarted","Data":"aa878f2f509dee8e7522c70f681ad800a1fcabda57d241e7f8e214270ca931e5"} Apr 16 18:12:28.568634 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.568636 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" event={"ID":"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c","Type":"ContainerStarted","Data":"258f08bb917f54ffde584ea79253183dbee25ffba0f3f9b53dc420e1b659becd"} Apr 16 18:12:28.569131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.568648 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" event={"ID":"dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c","Type":"ContainerStarted","Data":"8486777ea37045875cc2219fef2f4a260f37174720de071da1e6753e0e77cd60"} Apr 16 18:12:28.571340 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571314 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"727248a4a796b76886eb0696950e4d07d0ea2816e324426121922672d86c3735"} Apr 16 18:12:28.571458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571346 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"8cdd3f53715de8d703766e28ae6962f5df8578625449010f1a7581f2106425fb"} Apr 16 18:12:28.571458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571355 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"9c625ae89b48a2b663901ae355458cd569b50119ca88ab882e2ef2d74fa9532e"} Apr 16 18:12:28.571458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571363 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"ebb9b73f95a72f2ce6d1704b4ea4431a5433a7a96f8be50b3299c879ba0e9b89"} Apr 16 18:12:28.571458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571371 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"7fe47f36b856dfd21ed87a32d2b71f8235ca9a4f761422fba447e7642c40ff3d"} Apr 16 18:12:28.571458 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.571379 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46","Type":"ContainerStarted","Data":"0becc03e91694a10dda48d7c089494aac39369809a3acd5475afd24b90edb3aa"} Apr 16 18:12:28.593079 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.593024 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6b488c7c54-zh9bb" podStartSLOduration=1.7649338719999998 podStartE2EDuration="3.593011817s" podCreationTimestamp="2026-04-16 18:12:25 +0000 UTC" firstStartedPulling="2026-04-16 18:12:25.975939504 +0000 UTC m=+146.565137767" lastFinishedPulling="2026-04-16 18:12:27.804017447 
+0000 UTC m=+148.393215712" observedRunningTime="2026-04-16 18:12:28.590922619 +0000 UTC m=+149.180120893" watchObservedRunningTime="2026-04-16 18:12:28.593011817 +0000 UTC m=+149.182210103" Apr 16 18:12:28.617506 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:28.617454 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.617440948 podStartE2EDuration="2.617440948s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:28.61620026 +0000 UTC m=+149.205398570" watchObservedRunningTime="2026-04-16 18:12:28.617440948 +0000 UTC m=+149.206639291" Apr 16 18:12:31.948494 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:12:31.948447 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:26.948655 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:13:26.948617 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:26.964377 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:13:26.964346 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:13:27.760324 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:13:27.760296 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:14:16.136162 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.136082 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-569hm"] Apr 16 18:14:16.138285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.138270 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.140225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.140199 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:14:16.146267 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.146242 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-569hm"] Apr 16 18:14:16.202686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.202641 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-original-pull-secret\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.202686 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.202683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-dbus\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.202954 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.202790 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-kubelet-config\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.303779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.303722 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-original-pull-secret\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.303779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.303777 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-dbus\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.303995 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.303806 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-kubelet-config\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.303995 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.303873 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-kubelet-config\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.303995 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.303931 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-dbus\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.306222 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.306202 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5a325d12-fc69-40c7-a5dd-aa1bb836aedd-original-pull-secret\") pod \"global-pull-secret-syncer-569hm\" (UID: \"5a325d12-fc69-40c7-a5dd-aa1bb836aedd\") " pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.448526 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.448437 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-569hm" Apr 16 18:14:16.570478 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.570441 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-569hm"] Apr 16 18:14:16.573916 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:14:16.573876 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a325d12_fc69_40c7_a5dd_aa1bb836aedd.slice/crio-4fe0c7531380548df9f44716738e01e4835bdb579218ec14b03628927d77d40d WatchSource:0}: Error finding container 4fe0c7531380548df9f44716738e01e4835bdb579218ec14b03628927d77d40d: Status 404 returned error can't find the container with id 4fe0c7531380548df9f44716738e01e4835bdb579218ec14b03628927d77d40d Apr 16 18:14:16.887864 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:16.887823 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-569hm" event={"ID":"5a325d12-fc69-40c7-a5dd-aa1bb836aedd","Type":"ContainerStarted","Data":"4fe0c7531380548df9f44716738e01e4835bdb579218ec14b03628927d77d40d"} Apr 16 18:14:21.905687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:21.905648 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-569hm" event={"ID":"5a325d12-fc69-40c7-a5dd-aa1bb836aedd","Type":"ContainerStarted","Data":"46a87831f2acce0d10bee473f99bc36e64511ebb73fa5ecaf5f7d29ee5049cc3"} Apr 16 18:14:21.922119 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:21.922037 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-569hm" podStartSLOduration=1.595746633 podStartE2EDuration="5.922013327s" podCreationTimestamp="2026-04-16 18:14:16 +0000 UTC" firstStartedPulling="2026-04-16 18:14:16.575691847 +0000 UTC m=+257.164890112" lastFinishedPulling="2026-04-16 18:14:20.901958543 +0000 UTC m=+261.491156806" observedRunningTime="2026-04-16 18:14:21.919675387 +0000 UTC m=+262.508873675" watchObservedRunningTime="2026-04-16 18:14:21.922013327 +0000 UTC m=+262.511211614" Apr 16 18:14:59.875449 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:59.875420 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:14:59.878189 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:59.878167 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:14:59.879855 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:14:59.879838 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:17:25.622689 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.622559 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-pcp7k"] Apr 16 18:17:25.624834 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.624819 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:25.627060 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.627037 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:17:25.627215 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.627036 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:17:25.627215 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.627038 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:17:25.627555 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.627541 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-wxgt5\"" Apr 16 18:17:25.633956 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.633932 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pcp7k"] Apr 16 18:17:25.779693 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.779645 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vjj\" (UniqueName: \"kubernetes.io/projected/1babc621-fc6c-4c0e-815b-531d596a1b08-kube-api-access-75vjj\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:25.779693 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.779699 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:25.880563 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:17:25.880463 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75vjj\" (UniqueName: \"kubernetes.io/projected/1babc621-fc6c-4c0e-815b-531d596a1b08-kube-api-access-75vjj\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:25.880563 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.880506 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:25.880772 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:17:25.880661 2583 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:17:25.880772 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:17:25.880714 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert podName:1babc621-fc6c-4c0e-815b-531d596a1b08 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.380699099 +0000 UTC m=+446.969897367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert") pod "odh-model-controller-696fc77849-pcp7k" (UID: "1babc621-fc6c-4c0e-815b-531d596a1b08") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:17:25.891001 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:25.890977 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vjj\" (UniqueName: \"kubernetes.io/projected/1babc621-fc6c-4c0e-815b-531d596a1b08-kube-api-access-75vjj\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:26.385842 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:26.385804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:26.388387 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:26.388367 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1babc621-fc6c-4c0e-815b-531d596a1b08-cert\") pod \"odh-model-controller-696fc77849-pcp7k\" (UID: \"1babc621-fc6c-4c0e-815b-531d596a1b08\") " pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:26.535933 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:26.535904 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:26.671879 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:26.671855 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pcp7k"] Apr 16 18:17:26.674554 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:17:26.674526 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1babc621_fc6c_4c0e_815b_531d596a1b08.slice/crio-e573ff5e69168947d12315b7b495264e93f036fe2f07f349b1ed61a2c31b520b WatchSource:0}: Error finding container e573ff5e69168947d12315b7b495264e93f036fe2f07f349b1ed61a2c31b520b: Status 404 returned error can't find the container with id e573ff5e69168947d12315b7b495264e93f036fe2f07f349b1ed61a2c31b520b Apr 16 18:17:26.675784 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:26.675769 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:17:27.399185 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:27.399104 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pcp7k" event={"ID":"1babc621-fc6c-4c0e-815b-531d596a1b08","Type":"ContainerStarted","Data":"e573ff5e69168947d12315b7b495264e93f036fe2f07f349b1ed61a2c31b520b"} Apr 16 18:17:29.406078 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:29.406033 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pcp7k" event={"ID":"1babc621-fc6c-4c0e-815b-531d596a1b08","Type":"ContainerStarted","Data":"44c39fd2cd3a9cefb34803dba587675099f41e2b4128a57a8ddf7e64621c79ee"} Apr 16 18:17:29.406605 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:29.406243 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-pcp7k" Apr 16 18:17:29.435240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:29.435187 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-pcp7k" podStartSLOduration=2.031633365 podStartE2EDuration="4.435171113s" podCreationTimestamp="2026-04-16 18:17:25 +0000 UTC" firstStartedPulling="2026-04-16 18:17:26.675888709 +0000 UTC m=+447.265086972" lastFinishedPulling="2026-04-16 18:17:29.079426451 +0000 UTC m=+449.668624720" observedRunningTime="2026-04-16 18:17:29.433749934 +0000 UTC m=+450.022948242" watchObservedRunningTime="2026-04-16 18:17:29.435171113 +0000 UTC m=+450.024369478"
Apr 16 18:17:30.652735 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.652704 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f55d4c9bf-96jqv"]
Apr 16 18:17:30.654878 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.654859 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.657103 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657082 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:17:30.657786 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657757 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:17:30.657871 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657816 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:17:30.657931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657877 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:17:30.657931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657888 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:17:30.658024 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.657939 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:17:30.658233 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.658210 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-szppp\""
Apr 16 18:17:30.658354 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.658271 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:17:30.663300 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.663283 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:17:30.667557 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.667534 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f55d4c9bf-96jqv"]
Apr 16 18:17:30.722415 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722377 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-oauth-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722449 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-trusted-ca-bundle\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722530 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-service-ca\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722570 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdt7h\" (UniqueName: \"kubernetes.io/projected/23874d6f-d723-4869-a1ef-da1eef2c795b-kube-api-access-bdt7h\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-oauth-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722696 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.722910 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.722716 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-console-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.823911 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.823874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-oauth-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.823948 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-trusted-ca-bundle\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.823990 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-service-ca\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824013 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdt7h\" (UniqueName: \"kubernetes.io/projected/23874d6f-d723-4869-a1ef-da1eef2c795b-kube-api-access-bdt7h\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824038 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-oauth-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824071 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824110 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-console-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824784 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824754 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-oauth-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824951 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824822 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-service-ca\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.824951 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824921 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-console-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.825059 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.824966 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23874d6f-d723-4869-a1ef-da1eef2c795b-trusted-ca-bundle\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.826721 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.826704 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-serving-cert\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.826913 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.826897 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23874d6f-d723-4869-a1ef-da1eef2c795b-console-oauth-config\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.832680 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.832648 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdt7h\" (UniqueName: \"kubernetes.io/projected/23874d6f-d723-4869-a1ef-da1eef2c795b-kube-api-access-bdt7h\") pod \"console-5f55d4c9bf-96jqv\" (UID: \"23874d6f-d723-4869-a1ef-da1eef2c795b\") " pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:30.965194 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:30.965094 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:31.088755 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:31.088696 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f55d4c9bf-96jqv"]
Apr 16 18:17:31.093514 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:17:31.093486 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23874d6f_d723_4869_a1ef_da1eef2c795b.slice/crio-7b1ccbf75450a646e57330c81572d760e60e76f39a1ffd3aa855939941b21051 WatchSource:0}: Error finding container 7b1ccbf75450a646e57330c81572d760e60e76f39a1ffd3aa855939941b21051: Status 404 returned error can't find the container with id 7b1ccbf75450a646e57330c81572d760e60e76f39a1ffd3aa855939941b21051
Apr 16 18:17:31.412833 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:31.412796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f55d4c9bf-96jqv" event={"ID":"23874d6f-d723-4869-a1ef-da1eef2c795b","Type":"ContainerStarted","Data":"6e803bf5ebfd22155fbabcfb899308664d2d1e34bca65dab95e55ecb9f45ef08"}
Apr 16 18:17:31.412833 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:31.412837 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f55d4c9bf-96jqv" event={"ID":"23874d6f-d723-4869-a1ef-da1eef2c795b","Type":"ContainerStarted","Data":"7b1ccbf75450a646e57330c81572d760e60e76f39a1ffd3aa855939941b21051"}
Apr 16 18:17:31.428726 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:31.428667 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f55d4c9bf-96jqv" podStartSLOduration=1.428648466 podStartE2EDuration="1.428648466s" podCreationTimestamp="2026-04-16 18:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:31.427972601 +0000 UTC m=+452.017170888" watchObservedRunningTime="2026-04-16 18:17:31.428648466 +0000 UTC m=+452.017846752"
Apr 16 18:17:40.411566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:40.411538 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-pcp7k"
Apr 16 18:17:40.966048 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:40.966006 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:40.966237 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:40.966061 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:40.970624 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:40.970602 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:41.321286 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.321211 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-l6mrt"]
Apr 16 18:17:41.323474 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.323459 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-l6mrt"
Apr 16 18:17:41.325452 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.325431 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:17:41.325600 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.325457 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vzcsd\""
Apr 16 18:17:41.330339 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.330311 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-l6mrt"]
Apr 16 18:17:41.418643 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.418605 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fdl\" (UniqueName: \"kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl\") pod \"s3-init-l6mrt\" (UID: \"b7435030-4cdd-4af7-9fe7-2940c8e93876\") " pod="kserve/s3-init-l6mrt"
Apr 16 18:17:41.447592 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.447543 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f55d4c9bf-96jqv"
Apr 16 18:17:41.519242 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.519210 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79fdl\" (UniqueName: \"kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl\") pod \"s3-init-l6mrt\" (UID: \"b7435030-4cdd-4af7-9fe7-2940c8e93876\") " pod="kserve/s3-init-l6mrt"
Apr 16 18:17:41.528443 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.528419 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fdl\" (UniqueName: \"kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl\") pod \"s3-init-l6mrt\" (UID: \"b7435030-4cdd-4af7-9fe7-2940c8e93876\") " pod="kserve/s3-init-l6mrt"
Apr 16 18:17:41.645271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.645243 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-l6mrt"
Apr 16 18:17:41.765427 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:41.765393 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-l6mrt"]
Apr 16 18:17:41.768776 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:17:41.768746 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7435030_4cdd_4af7_9fe7_2940c8e93876.slice/crio-8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7 WatchSource:0}: Error finding container 8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7: Status 404 returned error can't find the container with id 8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7
Apr 16 18:17:42.449355 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:42.449322 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-l6mrt" event={"ID":"b7435030-4cdd-4af7-9fe7-2940c8e93876","Type":"ContainerStarted","Data":"8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7"}
Apr 16 18:17:46.462990 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:46.462896 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-l6mrt" event={"ID":"b7435030-4cdd-4af7-9fe7-2940c8e93876","Type":"ContainerStarted","Data":"38b04fbd53d880431d9f3bed5f15fc2fa0d5c5b40c78d33a1a722634e4b33037"}
Apr 16 18:17:46.476238 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:46.476183 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-l6mrt" podStartSLOduration=1.087855192 podStartE2EDuration="5.476166162s" podCreationTimestamp="2026-04-16 18:17:41 +0000 UTC" firstStartedPulling="2026-04-16 18:17:41.770531934 +0000 UTC m=+462.359730197" lastFinishedPulling="2026-04-16 18:17:46.158842897 +0000 UTC m=+466.748041167" observedRunningTime="2026-04-16 18:17:46.475416642 +0000 UTC m=+467.064614953" watchObservedRunningTime="2026-04-16 18:17:46.476166162 +0000 UTC m=+467.065364449"
Apr 16 18:17:49.473191 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:49.473152 2583 generic.go:358] "Generic (PLEG): container finished" podID="b7435030-4cdd-4af7-9fe7-2940c8e93876" containerID="38b04fbd53d880431d9f3bed5f15fc2fa0d5c5b40c78d33a1a722634e4b33037" exitCode=0
Apr 16 18:17:49.473557 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:49.473225 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-l6mrt" event={"ID":"b7435030-4cdd-4af7-9fe7-2940c8e93876","Type":"ContainerDied","Data":"38b04fbd53d880431d9f3bed5f15fc2fa0d5c5b40c78d33a1a722634e4b33037"}
Apr 16 18:17:50.599431 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:50.599408 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-l6mrt"
Apr 16 18:17:50.698067 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:50.698033 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fdl\" (UniqueName: \"kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl\") pod \"b7435030-4cdd-4af7-9fe7-2940c8e93876\" (UID: \"b7435030-4cdd-4af7-9fe7-2940c8e93876\") "
Apr 16 18:17:50.700350 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:50.700325 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl" (OuterVolumeSpecName: "kube-api-access-79fdl") pod "b7435030-4cdd-4af7-9fe7-2940c8e93876" (UID: "b7435030-4cdd-4af7-9fe7-2940c8e93876"). InnerVolumeSpecName "kube-api-access-79fdl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:17:50.799550 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:50.799466 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-79fdl\" (UniqueName: \"kubernetes.io/projected/b7435030-4cdd-4af7-9fe7-2940c8e93876-kube-api-access-79fdl\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:17:51.480222 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:51.480179 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-l6mrt" event={"ID":"b7435030-4cdd-4af7-9fe7-2940c8e93876","Type":"ContainerDied","Data":"8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7"}
Apr 16 18:17:51.480222 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:51.480206 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-l6mrt"
Apr 16 18:17:51.480475 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:17:51.480213 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe4db7feda6e0822e1c4c26d6ce53393327c23296b0fc633e495876e4a333a7"
Apr 16 18:18:02.215574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.215544 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"]
Apr 16 18:18:02.216043 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.215853 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7435030-4cdd-4af7-9fe7-2940c8e93876" containerName="s3-init"
Apr 16 18:18:02.216043 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.215865 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7435030-4cdd-4af7-9fe7-2940c8e93876" containerName="s3-init"
Apr 16 18:18:02.216043 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.215916 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7435030-4cdd-4af7-9fe7-2940c8e93876" containerName="s3-init"
Apr 16 18:18:02.221047 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.221027 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:02.223059 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.223040 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7d9sz\""
Apr 16 18:18:02.228009 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.227986 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"]
Apr 16 18:18:02.291088 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.291054 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr\" (UID: \"cb6437f7-3993-4858-a222-e997b89c17a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:02.391931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.391900 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr\" (UID: \"cb6437f7-3993-4858-a222-e997b89c17a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:02.392310 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.392290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr\" (UID: \"cb6437f7-3993-4858-a222-e997b89c17a6\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:02.415413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.415381 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"]
Apr 16 18:18:02.421840 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.421817 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:02.427361 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.427334 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"]
Apr 16 18:18:02.492894 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.492809 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-89nm4\" (UID: \"5bba13cd-dc2d-45fe-ba30-3cdccac702a1\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:02.531623 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.531550 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:02.594300 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.594247 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-89nm4\" (UID: \"5bba13cd-dc2d-45fe-ba30-3cdccac702a1\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:02.594850 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.594826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-89nm4\" (UID: \"5bba13cd-dc2d-45fe-ba30-3cdccac702a1\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:02.661798 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.661753 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"]
Apr 16 18:18:02.664613 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:18:02.664565 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6437f7_3993_4858_a222_e997b89c17a6.slice/crio-2a8dca4eb723217610ff4a9e7006116256c24aea61f51ad4e48159ce7bf5bdf7 WatchSource:0}: Error finding container 2a8dca4eb723217610ff4a9e7006116256c24aea61f51ad4e48159ce7bf5bdf7: Status 404 returned error can't find the container with id 2a8dca4eb723217610ff4a9e7006116256c24aea61f51ad4e48159ce7bf5bdf7
Apr 16 18:18:02.735613 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.735561 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:02.857332 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:02.857305 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"]
Apr 16 18:18:02.859794 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:18:02.859767 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bba13cd_dc2d_45fe_ba30_3cdccac702a1.slice/crio-d4e7776d79b67472df8472b0591ed82913c458bd1d79e1886abd4868bf9880ee WatchSource:0}: Error finding container d4e7776d79b67472df8472b0591ed82913c458bd1d79e1886abd4868bf9880ee: Status 404 returned error can't find the container with id d4e7776d79b67472df8472b0591ed82913c458bd1d79e1886abd4868bf9880ee
Apr 16 18:18:03.517226 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:03.517184 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerStarted","Data":"d4e7776d79b67472df8472b0591ed82913c458bd1d79e1886abd4868bf9880ee"}
Apr 16 18:18:03.519210 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:03.519175 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerStarted","Data":"2a8dca4eb723217610ff4a9e7006116256c24aea61f51ad4e48159ce7bf5bdf7"}
Apr 16 18:18:08.537200 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:08.537158 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerStarted","Data":"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"}
Apr 16 18:18:08.538494 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:08.538467 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerStarted","Data":"29c9925d1c7d11b4bcbfa9882742166efad2392e9fbc14a6946e26e5a3480ef7"}
Apr 16 18:18:11.548804 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:11.548770 2583 generic.go:358] "Generic (PLEG): container finished" podID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerID="b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321" exitCode=0
Apr 16 18:18:11.549242 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:11.548840 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerDied","Data":"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"}
Apr 16 18:18:11.550298 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:11.550273 2583 generic.go:358] "Generic (PLEG): container finished" podID="cb6437f7-3993-4858-a222-e997b89c17a6" containerID="29c9925d1c7d11b4bcbfa9882742166efad2392e9fbc14a6946e26e5a3480ef7" exitCode=0
Apr 16 18:18:11.550394 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:11.550347 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerDied","Data":"29c9925d1c7d11b4bcbfa9882742166efad2392e9fbc14a6946e26e5a3480ef7"}
Apr 16 18:18:37.648072 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.648029 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerStarted","Data":"056870b25431e32ccf481538646ccd3e598ff7befadda277dc318584618ca99c"}
Apr 16 18:18:37.648574 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.648357 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:18:37.649926 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.649882 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:18:37.650036 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.649981 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerStarted","Data":"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"}
Apr 16 18:18:37.650271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.650257 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:18:37.651259 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.651237 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:18:37.664171 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.664115 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podStartSLOduration=1.731206247 podStartE2EDuration="35.664099893s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:02.666429039 +0000 UTC m=+483.255627315" lastFinishedPulling="2026-04-16 18:18:36.599322698 +0000 UTC m=+517.188520961" observedRunningTime="2026-04-16 18:18:37.662825947 +0000 UTC m=+518.252024233" watchObservedRunningTime="2026-04-16 18:18:37.664099893 +0000 UTC m=+518.253298178"
Apr 16 18:18:37.677509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:37.677454 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podStartSLOduration=1.462694254 podStartE2EDuration="35.677437501s" podCreationTimestamp="2026-04-16 18:18:02 +0000 UTC" firstStartedPulling="2026-04-16 18:18:02.861854468 +0000 UTC m=+483.451052732" lastFinishedPulling="2026-04-16 18:18:37.07659771 +0000 UTC m=+517.665795979" observedRunningTime="2026-04-16 18:18:37.675865174 +0000 UTC m=+518.265063470" watchObservedRunningTime="2026-04-16 18:18:37.677437501 +0000 UTC m=+518.266635787"
Apr 16 18:18:38.653405 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:38.653362 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:18:38.653405 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:38.653381 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:18:48.654071 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:48.654024 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:18:48.654457 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:48.654024 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:18:58.654414 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:58.654371 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:18:58.654850 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:18:58.654373 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:19:08.653848 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:08.653801 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:19:08.654363 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:08.653802 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:19:18.653732 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:18.653690 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 16 18:19:18.654111 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:18.653690 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:19:22.211498 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.211465 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"]
Apr 16 18:19:22.214654 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.214639 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"
Apr 16 18:19:22.216739 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.216712 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:19:22.216874 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.216742 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-47091-kube-rbac-proxy-sar-config\""
Apr 16 18:19:22.216999 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.216985 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-47091-serving-cert\""
Apr 16 18:19:22.222274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.222255 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"]
Apr 16 18:19:22.392765 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.392731 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls\") pod \"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.392939 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.392779 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle\") pod \"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.494051 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.493954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle\") pod \"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.494229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.494053 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls\") pod \"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.494670 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.494643 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle\") pod 
\"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.496519 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.496501 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls\") pod \"switch-graph-47091-595cb5c45c-pcgzw\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.525716 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.525680 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:22.649091 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.649064 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"] Apr 16 18:19:22.651769 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:19:22.651735 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833b85dc_abb2_43cf_945e_7f65358019e9.slice/crio-0cc6461c68cdd0697527e0a676e898f0bb5e760f6e2bd1f73410a670a650f588 WatchSource:0}: Error finding container 0cc6461c68cdd0697527e0a676e898f0bb5e760f6e2bd1f73410a670a650f588: Status 404 returned error can't find the container with id 0cc6461c68cdd0697527e0a676e898f0bb5e760f6e2bd1f73410a670a650f588 Apr 16 18:19:22.778414 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:22.778331 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" event={"ID":"833b85dc-abb2-43cf-945e-7f65358019e9","Type":"ContainerStarted","Data":"0cc6461c68cdd0697527e0a676e898f0bb5e760f6e2bd1f73410a670a650f588"} Apr 16 18:19:25.788989 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:25.788956 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" event={"ID":"833b85dc-abb2-43cf-945e-7f65358019e9","Type":"ContainerStarted","Data":"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f"} Apr 16 18:19:25.789484 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:25.789041 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:25.804865 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:25.804807 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podStartSLOduration=1.176090388 podStartE2EDuration="3.80479347s" podCreationTimestamp="2026-04-16 18:19:22 +0000 UTC" firstStartedPulling="2026-04-16 18:19:22.65369673 +0000 UTC m=+563.242894995" lastFinishedPulling="2026-04-16 18:19:25.282399812 +0000 UTC m=+565.871598077" observedRunningTime="2026-04-16 18:19:25.802643613 +0000 UTC m=+566.391841897" watchObservedRunningTime="2026-04-16 18:19:25.80479347 +0000 UTC m=+566.393991755" Apr 16 18:19:28.654411 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:28.654370 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 16 18:19:28.654849 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:28.654370 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:19:31.799999 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:31.799973 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:32.384051 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:32.384011 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"] Apr 16 18:19:32.384366 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:32.384313 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" containerID="cri-o://f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f" gracePeriod=30 Apr 16 18:19:36.798137 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:36.798101 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:38.654125 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:38.654084 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 16 18:19:38.654799 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:38.654761 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" Apr 16 18:19:41.798747 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:41.798703 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:45.992082 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:45.992052 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" Apr 16 18:19:46.799475 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:46.799426 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:46.799676 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:46.799529 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:19:51.798399 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:51.798364 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:56.797897 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:56.797859 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:19:59.900634 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:59.900603 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:19:59.904175 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:19:59.904151 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:20:01.798172 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:01.798130 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:20:02.429784 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:02.429747 2583 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833b85dc_abb2_43cf_945e_7f65358019e9.slice/crio-conmon-f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:20:02.520381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.520352 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:20:02.628866 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.628833 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle\") pod \"833b85dc-abb2-43cf-945e-7f65358019e9\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " Apr 16 18:20:02.629158 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.628934 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls\") pod \"833b85dc-abb2-43cf-945e-7f65358019e9\" (UID: \"833b85dc-abb2-43cf-945e-7f65358019e9\") " Apr 16 18:20:02.629233 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.629176 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle" 
(OuterVolumeSpecName: "openshift-service-ca-bundle") pod "833b85dc-abb2-43cf-945e-7f65358019e9" (UID: "833b85dc-abb2-43cf-945e-7f65358019e9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:02.631335 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.631305 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "833b85dc-abb2-43cf-945e-7f65358019e9" (UID: "833b85dc-abb2-43cf-945e-7f65358019e9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:02.730115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.730008 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/833b85dc-abb2-43cf-945e-7f65358019e9-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:20:02.730115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.730054 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/833b85dc-abb2-43cf-945e-7f65358019e9-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:20:02.905560 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.905524 2583 generic.go:358] "Generic (PLEG): container finished" podID="833b85dc-abb2-43cf-945e-7f65358019e9" containerID="f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f" exitCode=0 Apr 16 18:20:02.906013 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.905619 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" event={"ID":"833b85dc-abb2-43cf-945e-7f65358019e9","Type":"ContainerDied","Data":"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f"} Apr 16 18:20:02.906013 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:20:02.905657 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" event={"ID":"833b85dc-abb2-43cf-945e-7f65358019e9","Type":"ContainerDied","Data":"0cc6461c68cdd0697527e0a676e898f0bb5e760f6e2bd1f73410a670a650f588"} Apr 16 18:20:02.906013 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.905673 2583 scope.go:117] "RemoveContainer" containerID="f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f" Apr 16 18:20:02.906013 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.905631 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw" Apr 16 18:20:02.914573 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.914557 2583 scope.go:117] "RemoveContainer" containerID="f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f" Apr 16 18:20:02.914869 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:02.914851 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f\": container with ID starting with f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f not found: ID does not exist" containerID="f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f" Apr 16 18:20:02.914934 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.914878 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f"} err="failed to get container status \"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f\": rpc error: code = NotFound desc = could not find container \"f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f\": container with ID starting with f7c9e12f3f381c8c6add585de86e2c72d1acc1b8ddf2e8a2ab75ccbe5831d74f not 
found: ID does not exist" Apr 16 18:20:02.924804 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.924780 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"] Apr 16 18:20:02.927112 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:02.927091 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-47091-595cb5c45c-pcgzw"] Apr 16 18:20:03.992074 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:03.992042 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" path="/var/lib/kubelet/pods/833b85dc-abb2-43cf-945e-7f65358019e9/volumes" Apr 16 18:20:12.172651 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.172606 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"] Apr 16 18:20:12.173053 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.172978 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" Apr 16 18:20:12.173053 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.172996 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" Apr 16 18:20:12.173125 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.173069 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="833b85dc-abb2-43cf-945e-7f65358019e9" containerName="switch-graph-47091" Apr 16 18:20:12.175872 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.175855 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.178038 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.178014 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 16 18:20:12.178038 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.178029 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 16 18:20:12.178225 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.178022 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:20:12.182990 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.182572 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"] Apr 16 18:20:12.316177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.316124 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.316366 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.316189 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.417179 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.417147 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.417179 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.417196 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.417794 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.417761 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.419779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.419758 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls\") pod \"model-chainer-55559dfcf8-vvdns\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") " pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.487699 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.487607 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.633828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.633790 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"] Apr 16 18:20:12.634062 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:20:12.634026 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2841e7a_1489_4ee2_867d_68ad0e8e8a68.slice/crio-99a78133e139db59e3442acd349a86f34ba40cdc9949eb846846cf6b712b82e0 WatchSource:0}: Error finding container 99a78133e139db59e3442acd349a86f34ba40cdc9949eb846846cf6b712b82e0: Status 404 returned error can't find the container with id 99a78133e139db59e3442acd349a86f34ba40cdc9949eb846846cf6b712b82e0 Apr 16 18:20:12.937443 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.937406 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" event={"ID":"a2841e7a-1489-4ee2-867d-68ad0e8e8a68","Type":"ContainerStarted","Data":"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"} Apr 16 18:20:12.937443 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.937445 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" event={"ID":"a2841e7a-1489-4ee2-867d-68ad0e8e8a68","Type":"ContainerStarted","Data":"99a78133e139db59e3442acd349a86f34ba40cdc9949eb846846cf6b712b82e0"} Apr 16 18:20:12.937820 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.937554 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:12.952876 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:12.952826 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podStartSLOduration=0.952807882 
podStartE2EDuration="952.807882ms" podCreationTimestamp="2026-04-16 18:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:12.952006552 +0000 UTC m=+613.541204838" watchObservedRunningTime="2026-04-16 18:20:12.952807882 +0000 UTC m=+613.542006169" Apr 16 18:20:18.946831 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:18.946803 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" Apr 16 18:20:22.291694 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.291652 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"] Apr 16 18:20:22.292161 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.291926 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" containerID="cri-o://98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2" gracePeriod=30 Apr 16 18:20:22.353888 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.353850 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"] Apr 16 18:20:22.354151 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.354128 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container" containerID="cri-o://6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693" gracePeriod=30 Apr 16 18:20:22.468370 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.468328 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"] Apr 16 18:20:22.468790 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:22.468751 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" containerID="cri-o://056870b25431e32ccf481538646ccd3e598ff7befadda277dc318584618ca99c" gracePeriod=30
Apr 16 18:20:23.944778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:23.944730 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:25.988047 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:25.988010 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 16 18:20:26.202061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.202040 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:20:26.233957 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.233921 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location\") pod \"5bba13cd-dc2d-45fe-ba30-3cdccac702a1\" (UID: \"5bba13cd-dc2d-45fe-ba30-3cdccac702a1\") "
Apr 16 18:20:26.234274 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.234247 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5bba13cd-dc2d-45fe-ba30-3cdccac702a1" (UID: "5bba13cd-dc2d-45fe-ba30-3cdccac702a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:26.334908 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.334810 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bba13cd-dc2d-45fe-ba30-3cdccac702a1-kserve-provision-location\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:20:26.980837 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.980804 2583 generic.go:358] "Generic (PLEG): container finished" podID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerID="6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693" exitCode=0
Apr 16 18:20:26.981020 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.980887 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"
Apr 16 18:20:26.981020 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.980885 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerDied","Data":"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"}
Apr 16 18:20:26.981020 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.980932 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4" event={"ID":"5bba13cd-dc2d-45fe-ba30-3cdccac702a1","Type":"ContainerDied","Data":"d4e7776d79b67472df8472b0591ed82913c458bd1d79e1886abd4868bf9880ee"}
Apr 16 18:20:26.981020 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.980952 2583 scope.go:117] "RemoveContainer" containerID="6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"
Apr 16 18:20:26.982763 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.982737 2583 generic.go:358] "Generic (PLEG): container finished" podID="cb6437f7-3993-4858-a222-e997b89c17a6" containerID="056870b25431e32ccf481538646ccd3e598ff7befadda277dc318584618ca99c" exitCode=0
Apr 16 18:20:26.982892 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.982772 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerDied","Data":"056870b25431e32ccf481538646ccd3e598ff7befadda277dc318584618ca99c"}
Apr 16 18:20:26.989562 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:26.989543 2583 scope.go:117] "RemoveContainer" containerID="b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"
Apr 16 18:20:27.003459 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.003439 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"]
Apr 16 18:20:27.008468 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.008451 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:20:27.008813 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.008797 2583 scope.go:117] "RemoveContainer" containerID="6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"
Apr 16 18:20:27.009083 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:27.009062 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693\": container with ID starting with 6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693 not found: ID does not exist" containerID="6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"
Apr 16 18:20:27.009131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.009090 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693"} err="failed to get container status \"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693\": rpc error: code = NotFound desc = could not find container \"6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693\": container with ID starting with 6144e26c57a741f2e99c32e6e0a5d2e37db22cc7c23cb0d7e942ad6394841693 not found: ID does not exist"
Apr 16 18:20:27.009131 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.009108 2583 scope.go:117] "RemoveContainer" containerID="b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"
Apr 16 18:20:27.009365 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:27.009313 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321\": container with ID starting with b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321 not found: ID does not exist" containerID="b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"
Apr 16 18:20:27.009365 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.009347 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321"} err="failed to get container status \"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321\": rpc error: code = NotFound desc = could not find container \"b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321\": container with ID starting with b2f0e6123dbbd50f6e6b3d109bab6fff2c4726f99a844cc1dcc27b22726be321 not found: ID does not exist"
Apr 16 18:20:27.009700 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.009681 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-89nm4"]
Apr 16 18:20:27.038467 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.038391 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location\") pod \"cb6437f7-3993-4858-a222-e997b89c17a6\" (UID: \"cb6437f7-3993-4858-a222-e997b89c17a6\") "
Apr 16 18:20:27.038714 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.038692 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb6437f7-3993-4858-a222-e997b89c17a6" (UID: "cb6437f7-3993-4858-a222-e997b89c17a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:27.138989 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.138963 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb6437f7-3993-4858-a222-e997b89c17a6-kserve-provision-location\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:20:27.988221 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.988190 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"
Apr 16 18:20:27.990843 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.990815 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" path="/var/lib/kubelet/pods/5bba13cd-dc2d-45fe-ba30-3cdccac702a1/volumes"
Apr 16 18:20:27.991214 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.991196 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr" event={"ID":"cb6437f7-3993-4858-a222-e997b89c17a6","Type":"ContainerDied","Data":"2a8dca4eb723217610ff4a9e7006116256c24aea61f51ad4e48159ce7bf5bdf7"}
Apr 16 18:20:27.991257 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.991229 2583 scope.go:117] "RemoveContainer" containerID="056870b25431e32ccf481538646ccd3e598ff7befadda277dc318584618ca99c"
Apr 16 18:20:27.999700 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:27.999670 2583 scope.go:117] "RemoveContainer" containerID="29c9925d1c7d11b4bcbfa9882742166efad2392e9fbc14a6946e26e5a3480ef7"
Apr 16 18:20:28.009975 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:28.009951 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"]
Apr 16 18:20:28.013285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:28.013263 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-69d5d56664-9jmvr"]
Apr 16 18:20:28.945090 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:28.945047 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:29.991184 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:29.991152 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" path="/var/lib/kubelet/pods/cb6437f7-3993-4858-a222-e997b89c17a6/volumes"
Apr 16 18:20:33.945150 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:33.945105 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:33.945558 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:33.945219 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"
Apr 16 18:20:38.945393 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:38.945355 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:42.636704 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.636666 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"]
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637021 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="storage-initializer"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637034 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="storage-initializer"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637050 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637056 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637080 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="storage-initializer"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637085 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="storage-initializer"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637098 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container"
Apr 16 18:20:42.637105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637103 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container"
Apr 16 18:20:42.637390 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637148 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="5bba13cd-dc2d-45fe-ba30-3cdccac702a1" containerName="kserve-container"
Apr 16 18:20:42.637390 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.637157 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb6437f7-3993-4858-a222-e997b89c17a6" containerName="kserve-container"
Apr 16 18:20:42.639884 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.639869 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:42.641894 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.641862 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-14a0c-serving-cert\""
Apr 16 18:20:42.641894 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.641866 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-14a0c-kube-rbac-proxy-sar-config\""
Apr 16 18:20:42.648177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.648142 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"]
Apr 16 18:20:42.753345 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.753302 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:42.753504 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.753384 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:42.854546 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.854507 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:42.854744 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.854599 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:42.854744 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:42.854689 2583 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-14a0c-serving-cert: secret "switch-graph-14a0c-serving-cert" not found
Apr 16 18:20:42.854823 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:42.854757 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls podName:e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:43.354739444 +0000 UTC m=+643.943937708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls") pod "switch-graph-14a0c-57c765bd5b-44zs6" (UID: "e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931") : secret "switch-graph-14a0c-serving-cert" not found
Apr 16 18:20:42.855181 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:42.855164 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:43.358502 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:43.358459 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:43.361029 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:43.361007 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") pod \"switch-graph-14a0c-57c765bd5b-44zs6\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:43.550538 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:43.550507 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:43.670144 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:43.670095 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"]
Apr 16 18:20:43.672845 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:20:43.672813 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d2a89e_50bc_47ef_a4c9_9e0d6e5a4931.slice/crio-aea7994d3bb87046d48f6386cfeb6f316f6dcdd03e4ca7799d2a357874295d44 WatchSource:0}: Error finding container aea7994d3bb87046d48f6386cfeb6f316f6dcdd03e4ca7799d2a357874295d44: Status 404 returned error can't find the container with id aea7994d3bb87046d48f6386cfeb6f316f6dcdd03e4ca7799d2a357874295d44
Apr 16 18:20:43.944880 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:43.944847 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:44.032992 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:44.032953 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" event={"ID":"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931","Type":"ContainerStarted","Data":"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d"}
Apr 16 18:20:44.032992 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:44.032991 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" event={"ID":"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931","Type":"ContainerStarted","Data":"aea7994d3bb87046d48f6386cfeb6f316f6dcdd03e4ca7799d2a357874295d44"}
Apr 16 18:20:44.033239 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:44.033069 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:44.048445 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:44.048397 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podStartSLOduration=2.048380777 podStartE2EDuration="2.048380777s" podCreationTimestamp="2026-04-16 18:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:44.048136477 +0000 UTC m=+644.637334764" watchObservedRunningTime="2026-04-16 18:20:44.048380777 +0000 UTC m=+644.637579063"
Apr 16 18:20:48.944330 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:48.944295 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:50.043162 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:50.043134 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:20:52.929349 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:52.929329 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"
Apr 16 18:20:53.043886 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.043843 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle\") pod \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") "
Apr 16 18:20:53.044062 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.043907 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls\") pod \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\" (UID: \"a2841e7a-1489-4ee2-867d-68ad0e8e8a68\") "
Apr 16 18:20:53.044192 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.044163 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a2841e7a-1489-4ee2-867d-68ad0e8e8a68" (UID: "a2841e7a-1489-4ee2-867d-68ad0e8e8a68"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:20:53.044625 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.044372 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:20:53.046183 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.046162 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a2841e7a-1489-4ee2-867d-68ad0e8e8a68" (UID: "a2841e7a-1489-4ee2-867d-68ad0e8e8a68"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:20:53.060766 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.060694 2583 generic.go:358] "Generic (PLEG): container finished" podID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerID="98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2" exitCode=137
Apr 16 18:20:53.060766 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.060735 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" event={"ID":"a2841e7a-1489-4ee2-867d-68ad0e8e8a68","Type":"ContainerDied","Data":"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"}
Apr 16 18:20:53.060766 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.060757 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns" event={"ID":"a2841e7a-1489-4ee2-867d-68ad0e8e8a68","Type":"ContainerDied","Data":"99a78133e139db59e3442acd349a86f34ba40cdc9949eb846846cf6b712b82e0"}
Apr 16 18:20:53.060981 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.060767 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"
Apr 16 18:20:53.060981 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.060779 2583 scope.go:117] "RemoveContainer" containerID="98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"
Apr 16 18:20:53.068930 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.068911 2583 scope.go:117] "RemoveContainer" containerID="98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"
Apr 16 18:20:53.069193 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:20:53.069173 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2\": container with ID starting with 98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2 not found: ID does not exist" containerID="98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"
Apr 16 18:20:53.069240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.069202 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2"} err="failed to get container status \"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2\": rpc error: code = NotFound desc = could not find container \"98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2\": container with ID starting with 98d5a38d356303ac2222565ff8955c9752b83dca5022dd943c9a56c75ce58ab2 not found: ID does not exist"
Apr 16 18:20:53.084090 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.084062 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"]
Apr 16 18:20:53.087727 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.087706 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-55559dfcf8-vvdns"]
Apr 16 18:20:53.145767 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.145740 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2841e7a-1489-4ee2-867d-68ad0e8e8a68-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:20:53.991750 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:20:53.991713 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" path="/var/lib/kubelet/pods/a2841e7a-1489-4ee2-867d-68ad0e8e8a68/volumes"
Apr 16 18:21:32.489379 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.489345 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"]
Apr 16 18:21:32.489889 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.489685 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer"
Apr 16 18:21:32.489889 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.489697 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer"
Apr 16 18:21:32.489889 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.489738 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2841e7a-1489-4ee2-867d-68ad0e8e8a68" containerName="model-chainer"
Apr 16 18:21:32.492486 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.492471 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:32.494422 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.494404 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-68d6a-serving-cert\""
Apr 16 18:21:32.494511 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.494407 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-68d6a-kube-rbac-proxy-sar-config\""
Apr 16 18:21:32.501995 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.501963 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"]
Apr 16 18:21:32.572313 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.572278 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:32.572313 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.572315 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:32.672785 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.672749 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:32.672785 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.672788 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:32.673051 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:21:32.672881 2583 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-68d6a-serving-cert: secret "sequence-graph-68d6a-serving-cert" not found
Apr 16 18:21:32.673051 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:21:32.672951 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls podName:cb05b91f-71b6-4c5c-ba47-b4c77cfa716d nodeName:}" failed. No retries permitted until 2026-04-16 18:21:33.172930857 +0000 UTC m=+693.762129123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls") pod "sequence-graph-68d6a-5bfbb5d456-zkhmz" (UID: "cb05b91f-71b6-4c5c-ba47-b4c77cfa716d") : secret "sequence-graph-68d6a-serving-cert" not found
Apr 16 18:21:32.673339 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:32.673317 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:33.176524 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:33.176473 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:33.179033 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:33.179011 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") pod \"sequence-graph-68d6a-5bfbb5d456-zkhmz\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:33.403154 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:33.403119 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:33.525658 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:33.525544 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"]
Apr 16 18:21:33.528536 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:21:33.528510 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb05b91f_71b6_4c5c_ba47_b4c77cfa716d.slice/crio-b84427d14b2fc8552bb4927b483cf5e3315e3377e4344a29d2aa2f27e3a46723 WatchSource:0}: Error finding container b84427d14b2fc8552bb4927b483cf5e3315e3377e4344a29d2aa2f27e3a46723: Status 404 returned error can't find the container with id b84427d14b2fc8552bb4927b483cf5e3315e3377e4344a29d2aa2f27e3a46723
Apr 16 18:21:34.189572 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:34.189533 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" event={"ID":"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d","Type":"ContainerStarted","Data":"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd"}
Apr 16 18:21:34.189572 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:34.189568 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" event={"ID":"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d","Type":"ContainerStarted","Data":"b84427d14b2fc8552bb4927b483cf5e3315e3377e4344a29d2aa2f27e3a46723"}
Apr 16 18:21:34.189850 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:34.189667 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:21:34.204352 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:34.204300 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podStartSLOduration=2.204288508 podStartE2EDuration="2.204288508s" podCreationTimestamp="2026-04-16 18:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:34.203098568 +0000 UTC m=+694.792296905" watchObservedRunningTime="2026-04-16 18:21:34.204288508 +0000 UTC m=+694.793486793"
Apr 16 18:21:40.198471 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:21:40.198442 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"
Apr 16 18:24:59.922857 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:24:59.922816 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:24:59.924223 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:24:59.924192 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:28:57.408705 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:28:57.408612 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"]
Apr 16 18:28:57.411376 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:28:57.408896 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" containerID="cri-o://7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d" gracePeriod=30
Apr 16 18:29:00.039923 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:00.039891 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:29:05.040495 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:05.040447 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:29:10.039719 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:10.039678 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:29:10.040091 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:10.039790 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"
Apr 16 18:29:15.040285 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:15.040194 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:29:20.040476 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:20.040440 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:29:25.040310 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:25.040275 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" probeResult="failure"
output="HTTP probe failed with statuscode: 503" Apr 16 18:29:27.540979 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.540958 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" Apr 16 18:29:27.559809 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.559777 2583 generic.go:358] "Generic (PLEG): container finished" podID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerID="7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d" exitCode=0 Apr 16 18:29:27.559968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.559856 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" event={"ID":"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931","Type":"ContainerDied","Data":"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d"} Apr 16 18:29:27.559968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.559866 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" Apr 16 18:29:27.559968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.559893 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6" event={"ID":"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931","Type":"ContainerDied","Data":"aea7994d3bb87046d48f6386cfeb6f316f6dcdd03e4ca7799d2a357874295d44"} Apr 16 18:29:27.559968 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.559910 2583 scope.go:117] "RemoveContainer" containerID="7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d" Apr 16 18:29:27.569334 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.569316 2583 scope.go:117] "RemoveContainer" containerID="7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d" Apr 16 18:29:27.569621 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:29:27.569596 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d\": container with ID starting with 7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d not found: ID does not exist" containerID="7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d" Apr 16 18:29:27.569717 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.569630 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d"} err="failed to get container status \"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d\": rpc error: code = NotFound desc = could not find container \"7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d\": container with ID starting with 7174f8cdb4429671db4ab3ecdf2e1b0ab09eb7af15181dd80cf110f6cd1e707d not found: ID does not exist" Apr 16 18:29:27.594699 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:29:27.594676 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") pod \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " Apr 16 18:29:27.594818 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.594718 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle\") pod \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\" (UID: \"e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931\") " Apr 16 18:29:27.595089 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.595069 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" (UID: "e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:29:27.596843 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.596827 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" (UID: "e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:29:27.695476 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.695387 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:29:27.695476 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.695420 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:29:27.879344 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.879316 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"] Apr 16 18:29:27.882666 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.882643 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-14a0c-57c765bd5b-44zs6"] Apr 16 18:29:27.991808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:27.991726 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" path="/var/lib/kubelet/pods/e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931/volumes" Apr 16 18:29:47.316078 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:47.316044 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"] Apr 16 18:29:47.316693 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:47.316360 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" containerID="cri-o://473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd" gracePeriod=30 Apr 16 18:29:50.197309 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:29:50.197274 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:29:55.197091 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:55.197057 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:29:59.945769 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:59.945744 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:29:59.946258 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:29:59.946240 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:30:00.197541 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:00.197453 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:00.197726 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:00.197572 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" Apr 16 18:30:05.198511 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:05.198466 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" 
containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:07.712273 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.712197 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:07.712635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.712499 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" Apr 16 18:30:07.712635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.712509 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" Apr 16 18:30:07.712635 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.712572 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1d2a89e-50bc-47ef-a4c9-9e0d6e5a4931" containerName="switch-graph-14a0c" Apr 16 18:30:07.715605 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.715569 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.719642 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.719620 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-cc45f-kube-rbac-proxy-sar-config\"" Apr 16 18:30:07.719763 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.719626 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-cc45f-serving-cert\"" Apr 16 18:30:07.729063 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.729034 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:07.816606 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.816555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.816770 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.816623 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.917276 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.917241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: 
\"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.917276 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.917280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.917914 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.917895 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:07.919819 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:07.919794 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls\") pod \"ensemble-graph-cc45f-cb59667b-gd9bj\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:08.025375 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.025284 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:08.156776 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.156752 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:08.158984 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:30:08.158957 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod510fb515_99fa_4b73_b49c_227f18031b76.slice/crio-ecfaec6a340b1e8a928de63bfa0a177cd52bd1fe1ba4ef0263ebdbaed3a27ec5 WatchSource:0}: Error finding container ecfaec6a340b1e8a928de63bfa0a177cd52bd1fe1ba4ef0263ebdbaed3a27ec5: Status 404 returned error can't find the container with id ecfaec6a340b1e8a928de63bfa0a177cd52bd1fe1ba4ef0263ebdbaed3a27ec5 Apr 16 18:30:08.160742 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.160728 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:30:08.683828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.683792 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" event={"ID":"510fb515-99fa-4b73-b49c-227f18031b76","Type":"ContainerStarted","Data":"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453"} Apr 16 18:30:08.683828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.683827 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" event={"ID":"510fb515-99fa-4b73-b49c-227f18031b76","Type":"ContainerStarted","Data":"ecfaec6a340b1e8a928de63bfa0a177cd52bd1fe1ba4ef0263ebdbaed3a27ec5"} Apr 16 18:30:08.684040 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:08.683852 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:08.707455 ip-10-0-128-95 kubenswrapper[2583]: 
I0416 18:30:08.707413 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podStartSLOduration=1.707397973 podStartE2EDuration="1.707397973s" podCreationTimestamp="2026-04-16 18:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:08.704761156 +0000 UTC m=+1209.293959443" watchObservedRunningTime="2026-04-16 18:30:08.707397973 +0000 UTC m=+1209.296596237" Apr 16 18:30:10.197498 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:10.197457 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:14.692282 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:14.692254 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:15.196626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:15.196571 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:17.456780 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.456751 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" Apr 16 18:30:17.600401 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.600303 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle\") pod \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " Apr 16 18:30:17.600401 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.600363 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") pod \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\" (UID: \"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d\") " Apr 16 18:30:17.600684 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.600660 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" (UID: "cb05b91f-71b6-4c5c-ba47-b4c77cfa716d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:30:17.602542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.602520 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" (UID: "cb05b91f-71b6-4c5c-ba47-b4c77cfa716d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:30:17.701464 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.701425 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:30:17.701464 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.701459 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:30:17.713811 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.713781 2583 generic.go:358] "Generic (PLEG): container finished" podID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerID="473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd" exitCode=0 Apr 16 18:30:17.713959 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.713833 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" event={"ID":"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d","Type":"ContainerDied","Data":"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd"} Apr 16 18:30:17.713959 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.713845 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" Apr 16 18:30:17.713959 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.713856 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz" event={"ID":"cb05b91f-71b6-4c5c-ba47-b4c77cfa716d","Type":"ContainerDied","Data":"b84427d14b2fc8552bb4927b483cf5e3315e3377e4344a29d2aa2f27e3a46723"} Apr 16 18:30:17.713959 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.713871 2583 scope.go:117] "RemoveContainer" containerID="473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd" Apr 16 18:30:17.722272 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.722242 2583 scope.go:117] "RemoveContainer" containerID="473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd" Apr 16 18:30:17.722542 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:30:17.722522 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd\": container with ID starting with 473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd not found: ID does not exist" containerID="473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd" Apr 16 18:30:17.722624 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.722551 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd"} err="failed to get container status \"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd\": rpc error: code = NotFound desc = could not find container \"473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd\": container with ID starting with 473c5c74801a9a26a3eeb7de10065b07f03ff25adb6a7926ffb2f484aba1b5bd not found: ID does not exist" Apr 16 18:30:17.743460 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:30:17.743429 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"] Apr 16 18:30:17.759342 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.759308 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-68d6a-5bfbb5d456-zkhmz"] Apr 16 18:30:17.826938 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.826909 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:17.827163 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.827142 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" containerID="cri-o://a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453" gracePeriod=30 Apr 16 18:30:17.992041 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:17.992009 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" path="/var/lib/kubelet/pods/cb05b91f-71b6-4c5c-ba47-b4c77cfa716d/volumes" Apr 16 18:30:19.691437 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:19.691393 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:24.691201 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:24.691164 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:29.691624 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:29.691562 2583 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:29.692061 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:29.691723 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:34.691018 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:34.690983 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:39.691463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:39.691424 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:44.691211 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:44.691121 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:47.996065 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:47.996038 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:48.041641 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.041604 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls\") pod \"510fb515-99fa-4b73-b49c-227f18031b76\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " Apr 16 18:30:48.041818 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.041694 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle\") pod \"510fb515-99fa-4b73-b49c-227f18031b76\" (UID: \"510fb515-99fa-4b73-b49c-227f18031b76\") " Apr 16 18:30:48.042086 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.042058 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "510fb515-99fa-4b73-b49c-227f18031b76" (UID: "510fb515-99fa-4b73-b49c-227f18031b76"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:30:48.044049 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.044008 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "510fb515-99fa-4b73-b49c-227f18031b76" (UID: "510fb515-99fa-4b73-b49c-227f18031b76"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:30:48.142997 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.142961 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510fb515-99fa-4b73-b49c-227f18031b76-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:30:48.142997 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.142993 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/510fb515-99fa-4b73-b49c-227f18031b76-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:30:48.803838 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.803803 2583 generic.go:358] "Generic (PLEG): container finished" podID="510fb515-99fa-4b73-b49c-227f18031b76" containerID="a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453" exitCode=0 Apr 16 18:30:48.804035 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.803867 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" Apr 16 18:30:48.804035 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.803883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" event={"ID":"510fb515-99fa-4b73-b49c-227f18031b76","Type":"ContainerDied","Data":"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453"} Apr 16 18:30:48.804035 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.803917 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj" event={"ID":"510fb515-99fa-4b73-b49c-227f18031b76","Type":"ContainerDied","Data":"ecfaec6a340b1e8a928de63bfa0a177cd52bd1fe1ba4ef0263ebdbaed3a27ec5"} Apr 16 18:30:48.804035 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.803934 2583 scope.go:117] "RemoveContainer" containerID="a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453" Apr 16 18:30:48.812244 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.812229 2583 scope.go:117] "RemoveContainer" containerID="a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453" Apr 16 18:30:48.812525 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:30:48.812505 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453\": container with ID starting with a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453 not found: ID does not exist" containerID="a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453" Apr 16 18:30:48.812566 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.812533 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453"} err="failed to get container status 
\"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453\": rpc error: code = NotFound desc = could not find container \"a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453\": container with ID starting with a808d060057e2b91b574a710ff0e3e6df7190e052151c95c0844fadb409f1453 not found: ID does not exist" Apr 16 18:30:48.831262 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.831226 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:48.842542 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:48.842507 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-cc45f-cb59667b-gd9bj"] Apr 16 18:30:49.991677 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:49.991637 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="510fb515-99fa-4b73-b49c-227f18031b76" path="/var/lib/kubelet/pods/510fb515-99fa-4b73-b49c-227f18031b76/volumes" Apr 16 18:30:57.640164 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640132 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640444 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640456 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640465 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640470 2583 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640513 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb05b91f-71b6-4c5c-ba47-b4c77cfa716d" containerName="sequence-graph-68d6a" Apr 16 18:30:57.640627 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.640523 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="510fb515-99fa-4b73-b49c-227f18031b76" containerName="ensemble-graph-cc45f" Apr 16 18:30:57.643418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.643402 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.649403 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.649373 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c659a-serving-cert\"" Apr 16 18:30:57.650117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.650093 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7d9sz\"" Apr 16 18:30:57.650231 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.650169 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c659a-kube-rbac-proxy-sar-config\"" Apr 16 18:30:57.650294 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.650253 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:30:57.676133 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.676103 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:30:57.724858 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.724828 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle\") pod \"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.725034 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.724898 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls\") pod \"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.825663 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.825628 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle\") pod \"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.825835 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.825693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls\") pod \"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.826317 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.826290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle\") pod 
\"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.828271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.828254 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls\") pod \"sequence-graph-c659a-cd9d4c6d8-d4jbv\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:57.953715 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:57.953616 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:58.092286 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:58.092159 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:30:58.094492 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:30:58.094460 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d259fc2_8650_4916_b45f_ade258b42eb4.slice/crio-a7d903bafcd2cf171d9589165d4c0ac6fbb4d0553627e14ea521640694b886f6 WatchSource:0}: Error finding container a7d903bafcd2cf171d9589165d4c0ac6fbb4d0553627e14ea521640694b886f6: Status 404 returned error can't find the container with id a7d903bafcd2cf171d9589165d4c0ac6fbb4d0553627e14ea521640694b886f6 Apr 16 18:30:58.839448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:58.839414 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" event={"ID":"2d259fc2-8650-4916-b45f-ade258b42eb4","Type":"ContainerStarted","Data":"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b"} Apr 16 18:30:58.839448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:58.839446 2583 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" event={"ID":"2d259fc2-8650-4916-b45f-ade258b42eb4","Type":"ContainerStarted","Data":"a7d903bafcd2cf171d9589165d4c0ac6fbb4d0553627e14ea521640694b886f6"} Apr 16 18:30:58.839980 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:58.839472 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:30:58.867407 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:30:58.867357 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podStartSLOduration=1.867343682 podStartE2EDuration="1.867343682s" podCreationTimestamp="2026-04-16 18:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:58.864849756 +0000 UTC m=+1259.454048042" watchObservedRunningTime="2026-04-16 18:30:58.867343682 +0000 UTC m=+1259.456541968" Apr 16 18:31:04.847818 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:04.847788 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:31:07.723195 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:07.723103 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:31:07.723622 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:07.723321 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" containerID="cri-o://400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b" gracePeriod=30 Apr 16 18:31:09.846520 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:09.846485 2583 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:14.846294 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:14.846254 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:19.846283 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:19.846240 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:19.846754 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:19.846380 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:31:24.845962 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:24.845917 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:28.142194 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.142157 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:31:28.145449 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.145426 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.151046 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.151024 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-bac14-kube-rbac-proxy-sar-config\"" Apr 16 18:31:28.151183 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.151029 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-bac14-serving-cert\"" Apr 16 18:31:28.160616 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.160593 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:31:28.180270 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.180243 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.180390 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.180282 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.281010 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.280976 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: 
\"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.281272 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.281031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.281272 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:31:28.281121 2583 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-bac14-serving-cert: secret "ensemble-graph-bac14-serving-cert" not found Apr 16 18:31:28.281272 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:31:28.281184 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls podName:9f90f449-829f-41cb-822c-83380f6743bc nodeName:}" failed. No retries permitted until 2026-04-16 18:31:28.781168311 +0000 UTC m=+1289.370366578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls") pod "ensemble-graph-bac14-959f8bf68-g78lw" (UID: "9f90f449-829f-41cb-822c-83380f6743bc") : secret "ensemble-graph-bac14-serving-cert" not found Apr 16 18:31:28.281682 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.281665 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.783831 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.783799 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:28.786287 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:28.786264 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") pod \"ensemble-graph-bac14-959f8bf68-g78lw\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:29.055384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.055300 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:29.178118 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.178086 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:31:29.181371 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:31:29.181343 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f90f449_829f_41cb_822c_83380f6743bc.slice/crio-b05365d93e5625c880ff01126d4ce8e847a0a9685e51bb53dad73ca93b2e5a6a WatchSource:0}: Error finding container b05365d93e5625c880ff01126d4ce8e847a0a9685e51bb53dad73ca93b2e5a6a: Status 404 returned error can't find the container with id b05365d93e5625c880ff01126d4ce8e847a0a9685e51bb53dad73ca93b2e5a6a Apr 16 18:31:29.845787 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.845745 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:29.929970 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.929936 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" event={"ID":"9f90f449-829f-41cb-822c-83380f6743bc","Type":"ContainerStarted","Data":"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d"} Apr 16 18:31:29.929970 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.929972 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" event={"ID":"9f90f449-829f-41cb-822c-83380f6743bc","Type":"ContainerStarted","Data":"b05365d93e5625c880ff01126d4ce8e847a0a9685e51bb53dad73ca93b2e5a6a"} Apr 16 18:31:29.930157 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.930052 2583 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:29.948862 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:29.948800 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podStartSLOduration=1.948784386 podStartE2EDuration="1.948784386s" podCreationTimestamp="2026-04-16 18:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:29.947661126 +0000 UTC m=+1290.536859409" watchObservedRunningTime="2026-04-16 18:31:29.948784386 +0000 UTC m=+1290.537982672" Apr 16 18:31:34.845626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:34.845569 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:35.939202 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:35.939169 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:31:37.858222 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.858196 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:31:37.955501 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.953533 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls\") pod \"2d259fc2-8650-4916-b45f-ade258b42eb4\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " Apr 16 18:31:37.955501 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.953682 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle\") pod \"2d259fc2-8650-4916-b45f-ade258b42eb4\" (UID: \"2d259fc2-8650-4916-b45f-ade258b42eb4\") " Apr 16 18:31:37.955501 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.954377 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2d259fc2-8650-4916-b45f-ade258b42eb4" (UID: "2d259fc2-8650-4916-b45f-ade258b42eb4"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:31:37.958017 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.957983 2583 generic.go:358] "Generic (PLEG): container finished" podID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerID="400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b" exitCode=0 Apr 16 18:31:37.958170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.958082 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" event={"ID":"2d259fc2-8650-4916-b45f-ade258b42eb4","Type":"ContainerDied","Data":"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b"} Apr 16 18:31:37.958170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.958114 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" event={"ID":"2d259fc2-8650-4916-b45f-ade258b42eb4","Type":"ContainerDied","Data":"a7d903bafcd2cf171d9589165d4c0ac6fbb4d0553627e14ea521640694b886f6"} Apr 16 18:31:37.958170 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.958121 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv" Apr 16 18:31:37.958349 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.958131 2583 scope.go:117] "RemoveContainer" containerID="400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b" Apr 16 18:31:37.958663 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.958638 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d259fc2-8650-4916-b45f-ade258b42eb4" (UID: "2d259fc2-8650-4916-b45f-ade258b42eb4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:31:37.967604 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.967565 2583 scope.go:117] "RemoveContainer" containerID="400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b" Apr 16 18:31:37.967884 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:31:37.967865 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b\": container with ID starting with 400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b not found: ID does not exist" containerID="400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b" Apr 16 18:31:37.967931 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:37.967893 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b"} err="failed to get container status \"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b\": rpc error: code = NotFound desc = could not find container \"400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b\": container with ID starting with 400fe7e32d0715229a4aa43fdb1d03e8b3a6c375f9b3bdb304ace4e065d6d42b not found: ID does not exist" Apr 16 18:31:38.054892 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:38.054852 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d259fc2-8650-4916-b45f-ade258b42eb4-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:31:38.055060 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:38.054975 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d259fc2-8650-4916-b45f-ade258b42eb4-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:31:38.278264 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:38.278191 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:31:38.282674 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:38.282651 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c659a-cd9d4c6d8-d4jbv"] Apr 16 18:31:39.990815 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:31:39.990782 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" path="/var/lib/kubelet/pods/2d259fc2-8650-4916-b45f-ade258b42eb4/volumes" Apr 16 18:32:18.018767 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.018684 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"] Apr 16 18:32:18.019168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.019010 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" Apr 16 18:32:18.019168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.019023 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" Apr 16 18:32:18.019168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.019090 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d259fc2-8650-4916-b45f-ade258b42eb4" containerName="sequence-graph-c659a" Apr 16 18:32:18.022879 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.022861 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.025044 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.025020 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e9680-kube-rbac-proxy-sar-config\"" Apr 16 18:32:18.025122 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.025107 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-e9680-serving-cert\"" Apr 16 18:32:18.032058 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.032034 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"] Apr 16 18:32:18.187089 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.187060 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.187269 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.187105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.288465 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.288366 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: 
\"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.288465 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.288420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.289026 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.289005 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.290971 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.290945 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls\") pod \"sequence-graph-e9680-6cdfbbbc6f-c6qcf\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.333596 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.333531 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:18.453870 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:18.453839 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"] Apr 16 18:32:18.457243 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:32:18.457217 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6db6f19_b698_45c4_8a3e_77fe8b193239.slice/crio-2eba46d22669c2417798d67dafa2143dd6746a2fcf061224edf1d3e05d3b11cb WatchSource:0}: Error finding container 2eba46d22669c2417798d67dafa2143dd6746a2fcf061224edf1d3e05d3b11cb: Status 404 returned error can't find the container with id 2eba46d22669c2417798d67dafa2143dd6746a2fcf061224edf1d3e05d3b11cb Apr 16 18:32:19.081105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:19.081072 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" event={"ID":"b6db6f19-b698-45c4-8a3e-77fe8b193239","Type":"ContainerStarted","Data":"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e"} Apr 16 18:32:19.081105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:19.081107 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" event={"ID":"b6db6f19-b698-45c4-8a3e-77fe8b193239","Type":"ContainerStarted","Data":"2eba46d22669c2417798d67dafa2143dd6746a2fcf061224edf1d3e05d3b11cb"} Apr 16 18:32:19.081534 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:19.081197 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:32:19.098606 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:19.098542 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" 
podStartSLOduration=1.098529865 podStartE2EDuration="1.098529865s" podCreationTimestamp="2026-04-16 18:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:19.097411595 +0000 UTC m=+1339.686609894" watchObservedRunningTime="2026-04-16 18:32:19.098529865 +0000 UTC m=+1339.687728150" Apr 16 18:32:25.089796 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:32:25.089765 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:34:59.969863 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:34:59.969838 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:34:59.970374 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:34:59.970091 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:39:42.712100 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:42.712017 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:39:42.712725 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:42.712318 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" containerID="cri-o://73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d" gracePeriod=30 Apr 16 18:39:45.937928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:45.937886 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:50.937240 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:50.937195 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:55.937925 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:55.937885 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:39:55.938314 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:55.938024 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:39:59.992229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:59.992201 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:39:59.993526 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:39:59.993508 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log" Apr 16 18:40:00.937868 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:00.937833 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:05.937679 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:05.937642 2583 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:10.937840 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:10.937793 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:13.350528 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.350506 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:40:13.442494 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.442458 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") pod \"9f90f449-829f-41cb-822c-83380f6743bc\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " Apr 16 18:40:13.442694 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.442554 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle\") pod \"9f90f449-829f-41cb-822c-83380f6743bc\" (UID: \"9f90f449-829f-41cb-822c-83380f6743bc\") " Apr 16 18:40:13.442978 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.442952 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9f90f449-829f-41cb-822c-83380f6743bc" (UID: "9f90f449-829f-41cb-822c-83380f6743bc"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:40:13.444892 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.444865 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f90f449-829f-41cb-822c-83380f6743bc" (UID: "9f90f449-829f-41cb-822c-83380f6743bc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:40:13.445823 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.445796 2583 generic.go:358] "Generic (PLEG): container finished" podID="9f90f449-829f-41cb-822c-83380f6743bc" containerID="73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d" exitCode=0 Apr 16 18:40:13.445946 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.445861 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" Apr 16 18:40:13.445946 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.445883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" event={"ID":"9f90f449-829f-41cb-822c-83380f6743bc","Type":"ContainerDied","Data":"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d"} Apr 16 18:40:13.445946 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.445922 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw" event={"ID":"9f90f449-829f-41cb-822c-83380f6743bc","Type":"ContainerDied","Data":"b05365d93e5625c880ff01126d4ce8e847a0a9685e51bb53dad73ca93b2e5a6a"} Apr 16 18:40:13.445946 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.445937 2583 scope.go:117] "RemoveContainer" containerID="73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d" Apr 16 18:40:13.454754 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.454738 2583 scope.go:117] 
"RemoveContainer" containerID="73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d" Apr 16 18:40:13.455025 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:40:13.455005 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d\": container with ID starting with 73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d not found: ID does not exist" containerID="73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d" Apr 16 18:40:13.455076 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.455035 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d"} err="failed to get container status \"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d\": rpc error: code = NotFound desc = could not find container \"73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d\": container with ID starting with 73fa6c445d6ec2c1e9932a50ada0142c57d8756b31504925a883679896290f4d not found: ID does not exist" Apr 16 18:40:13.466003 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.465973 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:40:13.470002 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.469982 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-bac14-959f8bf68-g78lw"] Apr 16 18:40:13.543808 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.543738 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f90f449-829f-41cb-822c-83380f6743bc-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:40:13.543808 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:40:13.543769 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f90f449-829f-41cb-822c-83380f6743bc-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:40:13.991615 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:13.991563 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f90f449-829f-41cb-822c-83380f6743bc" path="/var/lib/kubelet/pods/9f90f449-829f-41cb-822c-83380f6743bc/volumes" Apr 16 18:40:32.585821 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:32.585780 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"] Apr 16 18:40:32.586303 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:32.586024 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" containerID="cri-o://6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e" gracePeriod=30 Apr 16 18:40:35.088816 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:35.088780 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:40.088540 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:40.088502 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:45.088590 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:45.088538 2583 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:45.088986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:45.088666 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:40:50.088957 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:50.088920 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:40:52.993814 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.993736 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"] Apr 16 18:40:52.994229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.994057 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" Apr 16 18:40:52.994229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.994068 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" Apr 16 18:40:52.994229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.994135 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f90f449-829f-41cb-822c-83380f6743bc" containerName="ensemble-graph-bac14" Apr 16 18:40:52.996951 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.996935 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:52.998913 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.998885 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e04f1-kube-rbac-proxy-sar-config\"" Apr 16 18:40:52.999126 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:52.999108 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e04f1-serving-cert\"" Apr 16 18:40:53.007242 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.007223 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"] Apr 16 18:40:53.079420 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.079346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.079603 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.079468 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.180229 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.180194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle\") pod 
\"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.180428 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.180244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.180428 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:40:53.180369 2583 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-e04f1-serving-cert: secret "splitter-graph-e04f1-serving-cert" not found Apr 16 18:40:53.180547 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:40:53.180434 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls podName:a3fdb99e-7095-49cd-97d7-f6c00152f7d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:40:53.680418142 +0000 UTC m=+1854.269616406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls") pod "splitter-graph-e04f1-7c7f988c75-9hw7m" (UID: "a3fdb99e-7095-49cd-97d7-f6c00152f7d2") : secret "splitter-graph-e04f1-serving-cert" not found Apr 16 18:40:53.180928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.180905 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.684105 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.684073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.686528 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.686504 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") pod \"splitter-graph-e04f1-7c7f988c75-9hw7m\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") " pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:53.906948 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:53.906912 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:54.035302 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:54.035256 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"] Apr 16 18:40:54.038775 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:40:54.038748 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fdb99e_7095_49cd_97d7_f6c00152f7d2.slice/crio-887f9ed0c0beeec0aaea8f7cdc8d9be738dc07535cb4c5d8ffc8d95fb76946d4 WatchSource:0}: Error finding container 887f9ed0c0beeec0aaea8f7cdc8d9be738dc07535cb4c5d8ffc8d95fb76946d4: Status 404 returned error can't find the container with id 887f9ed0c0beeec0aaea8f7cdc8d9be738dc07535cb4c5d8ffc8d95fb76946d4 Apr 16 18:40:54.040457 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:54.040440 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:40:54.569324 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:54.569292 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" event={"ID":"a3fdb99e-7095-49cd-97d7-f6c00152f7d2","Type":"ContainerStarted","Data":"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"} Apr 16 18:40:54.569324 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:54.569329 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" event={"ID":"a3fdb99e-7095-49cd-97d7-f6c00152f7d2","Type":"ContainerStarted","Data":"887f9ed0c0beeec0aaea8f7cdc8d9be738dc07535cb4c5d8ffc8d95fb76946d4"} Apr 16 18:40:54.569547 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:54.569360 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:40:54.584504 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:40:54.584271 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podStartSLOduration=2.58425509 podStartE2EDuration="2.58425509s" podCreationTimestamp="2026-04-16 18:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:54.584120101 +0000 UTC m=+1855.173318387" watchObservedRunningTime="2026-04-16 18:40:54.58425509 +0000 UTC m=+1855.173453378" Apr 16 18:40:55.088563 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:40:55.088527 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:41:00.093450 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:00.089398 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:41:00.577723 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:00.577698 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" Apr 16 18:41:02.726307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.726282 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:41:02.758802 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.758776 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle\") pod \"b6db6f19-b698-45c4-8a3e-77fe8b193239\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " Apr 16 18:41:02.758802 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.758809 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls\") pod \"b6db6f19-b698-45c4-8a3e-77fe8b193239\" (UID: \"b6db6f19-b698-45c4-8a3e-77fe8b193239\") " Apr 16 18:41:02.759143 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.759114 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b6db6f19-b698-45c4-8a3e-77fe8b193239" (UID: "b6db6f19-b698-45c4-8a3e-77fe8b193239"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:41:02.761023 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.761000 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b6db6f19-b698-45c4-8a3e-77fe8b193239" (UID: "b6db6f19-b698-45c4-8a3e-77fe8b193239"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:41:02.860115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.860021 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6db6f19-b698-45c4-8a3e-77fe8b193239-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:41:02.860115 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:02.860057 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6db6f19-b698-45c4-8a3e-77fe8b193239-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:41:03.083728 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.083695 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"] Apr 16 18:41:03.083986 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.083943 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" containerID="cri-o://003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a" gracePeriod=30 Apr 16 18:41:03.594978 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.594944 2583 generic.go:358] "Generic (PLEG): container finished" podID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerID="6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e" exitCode=0 Apr 16 18:41:03.595159 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.595014 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" event={"ID":"b6db6f19-b698-45c4-8a3e-77fe8b193239","Type":"ContainerDied","Data":"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e"} Apr 16 18:41:03.595159 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.595046 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" event={"ID":"b6db6f19-b698-45c4-8a3e-77fe8b193239","Type":"ContainerDied","Data":"2eba46d22669c2417798d67dafa2143dd6746a2fcf061224edf1d3e05d3b11cb"} Apr 16 18:41:03.595159 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.595061 2583 scope.go:117] "RemoveContainer" containerID="6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e" Apr 16 18:41:03.595159 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.595063 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf" Apr 16 18:41:03.604289 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.604271 2583 scope.go:117] "RemoveContainer" containerID="6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e" Apr 16 18:41:03.604567 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:41:03.604551 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e\": container with ID starting with 6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e not found: ID does not exist" containerID="6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e" Apr 16 18:41:03.604634 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.604592 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e"} err="failed to get container status \"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e\": rpc error: code = NotFound desc = could not find container \"6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e\": container with ID starting with 6f0923837686b63447d32594bccdaea25a2202967373f24098329722be6dbc1e not found: ID does not exist" Apr 16 18:41:03.615676 
ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.615652 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"]
Apr 16 18:41:03.621560 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.621537 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-e9680-6cdfbbbc6f-c6qcf"]
Apr 16 18:41:03.991016 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:03.990977 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" path="/var/lib/kubelet/pods/b6db6f19-b698-45c4-8a3e-77fe8b193239/volumes"
Apr 16 18:41:05.576328 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:05.576291 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:10.576469 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:10.576433 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:15.576805 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:15.576724 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:15.577257 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:15.576828 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"
Apr 16 18:41:20.575913 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:20.575876 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:25.575998 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:25.575959 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:30.576024 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:30.575983 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:41:33.266090 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.266059 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"
Apr 16 18:41:33.417354 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.417328 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle\") pod \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") "
Apr 16 18:41:33.417523 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.417404 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") pod \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\" (UID: \"a3fdb99e-7095-49cd-97d7-f6c00152f7d2\") "
Apr 16 18:41:33.417769 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.417749 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a3fdb99e-7095-49cd-97d7-f6c00152f7d2" (UID: "a3fdb99e-7095-49cd-97d7-f6c00152f7d2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:41:33.419602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.419565 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a3fdb99e-7095-49cd-97d7-f6c00152f7d2" (UID: "a3fdb99e-7095-49cd-97d7-f6c00152f7d2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:41:33.518169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.518133 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:41:33.518169 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.518163 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3fdb99e-7095-49cd-97d7-f6c00152f7d2-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:41:33.688359 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.688268 2583 generic.go:358] "Generic (PLEG): container finished" podID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerID="003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a" exitCode=0
Apr 16 18:41:33.688509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.688367 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" event={"ID":"a3fdb99e-7095-49cd-97d7-f6c00152f7d2","Type":"ContainerDied","Data":"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"}
Apr 16 18:41:33.688509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.688395 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m" event={"ID":"a3fdb99e-7095-49cd-97d7-f6c00152f7d2","Type":"ContainerDied","Data":"887f9ed0c0beeec0aaea8f7cdc8d9be738dc07535cb4c5d8ffc8d95fb76946d4"}
Apr 16 18:41:33.688509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.688399 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"
Apr 16 18:41:33.688509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.688414 2583 scope.go:117] "RemoveContainer" containerID="003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"
Apr 16 18:41:33.696928 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.696908 2583 scope.go:117] "RemoveContainer" containerID="003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"
Apr 16 18:41:33.697175 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:41:33.697155 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a\": container with ID starting with 003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a not found: ID does not exist" containerID="003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"
Apr 16 18:41:33.697236 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.697188 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a"} err="failed to get container status \"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a\": rpc error: code = NotFound desc = could not find container \"003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a\": container with ID starting with 003028b61c522ac0df1c6848a91be462019fb1999650e2ee3d5e7f5d9a19db5a not found: ID does not exist"
Apr 16 18:41:33.707992 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.707970 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"]
Apr 16 18:41:33.711418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.711397 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e04f1-7c7f988c75-9hw7m"]
Apr 16 18:41:33.991531 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:33.991454 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" path="/var/lib/kubelet/pods/a3fdb99e-7095-49cd-97d7-f6c00152f7d2/volumes"
Apr 16 18:41:42.821961 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.821917 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"]
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822260 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680"
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822274 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680"
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822291 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1"
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822301 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1"
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822346 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3fdb99e-7095-49cd-97d7-f6c00152f7d2" containerName="splitter-graph-e04f1"
Apr 16 18:41:42.822418 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.822359 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6db6f19-b698-45c4-8a3e-77fe8b193239" containerName="sequence-graph-e9680"
Apr 16 18:41:42.826839 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.826807 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:42.835265 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.834196 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-dbc56-serving-cert\""
Apr 16 18:41:42.835265 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.834494 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:41:42.835509 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.835297 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-dbc56-kube-rbac-proxy-sar-config\""
Apr 16 18:41:42.836522 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.836003 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-7d9sz\""
Apr 16 18:41:42.837403 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:42.837376 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"]
Apr 16 18:41:43.000818 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.000782 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.001004 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.000832 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.101342 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.101254 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.101489 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.101353 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.101489 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:41:43.101457 2583 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-dbc56-serving-cert: secret "switch-graph-dbc56-serving-cert" not found
Apr 16 18:41:43.101562 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:41:43.101518 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls podName:3fcbb6aa-4eba-4397-80a4-53930b788235 nodeName:}" failed. No retries permitted until 2026-04-16 18:41:43.60149947 +0000 UTC m=+1904.190697938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls") pod "switch-graph-dbc56-7657877866-wv6tx" (UID: "3fcbb6aa-4eba-4397-80a4-53930b788235") : secret "switch-graph-dbc56-serving-cert" not found
Apr 16 18:41:43.101957 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.101937 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.604893 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.604854 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.607507 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.607482 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") pod \"switch-graph-dbc56-7657877866-wv6tx\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.747134 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.747096 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:43.869100 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:43.869028 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"]
Apr 16 18:41:43.872376 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:41:43.872348 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fcbb6aa_4eba_4397_80a4_53930b788235.slice/crio-a424697ccab94770dc20dbba7c67a358f7871cdca058cd40fd546e450a4bd565 WatchSource:0}: Error finding container a424697ccab94770dc20dbba7c67a358f7871cdca058cd40fd546e450a4bd565: Status 404 returned error can't find the container with id a424697ccab94770dc20dbba7c67a358f7871cdca058cd40fd546e450a4bd565
Apr 16 18:41:44.723441 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:44.723395 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" event={"ID":"3fcbb6aa-4eba-4397-80a4-53930b788235","Type":"ContainerStarted","Data":"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52"}
Apr 16 18:41:44.723441 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:44.723438 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" event={"ID":"3fcbb6aa-4eba-4397-80a4-53930b788235","Type":"ContainerStarted","Data":"a424697ccab94770dc20dbba7c67a358f7871cdca058cd40fd546e450a4bd565"}
Apr 16 18:41:44.723687 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:44.723476 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:41:44.738400 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:44.738352 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podStartSLOduration=2.738337633 podStartE2EDuration="2.738337633s" podCreationTimestamp="2026-04-16 18:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:44.737448048 +0000 UTC m=+1905.326646335" watchObservedRunningTime="2026-04-16 18:41:44.738337633 +0000 UTC m=+1905.327535919"
Apr 16 18:41:50.731962 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:41:50.731931 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"
Apr 16 18:42:13.325862 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.325832 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:42:13.330256 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.330240 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.333080 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.333060 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-1ac5a-serving-cert\""
Apr 16 18:42:13.333352 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.333338 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-1ac5a-kube-rbac-proxy-sar-config\""
Apr 16 18:42:13.337297 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.336902 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:42:13.444272 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.444238 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.444448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.444294 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.544940 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.544901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.545125 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.544954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.545611 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.545572 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.547477 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.547456 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls\") pod \"splitter-graph-1ac5a-67bcd68668-mm8sj\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") " pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.640466 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.640444 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:13.761417 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.761319 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:42:13.764179 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:42:13.764151 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3f7f20_6cfe_4131_ad6a_809a0361a7b6.slice/crio-41ae1aa5d61d7841902fa1d464bb73977e3cf406225e6a88c852ddd8a94095f8 WatchSource:0}: Error finding container 41ae1aa5d61d7841902fa1d464bb73977e3cf406225e6a88c852ddd8a94095f8: Status 404 returned error can't find the container with id 41ae1aa5d61d7841902fa1d464bb73977e3cf406225e6a88c852ddd8a94095f8
Apr 16 18:42:13.812560 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:13.812538 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" event={"ID":"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6","Type":"ContainerStarted","Data":"41ae1aa5d61d7841902fa1d464bb73977e3cf406225e6a88c852ddd8a94095f8"}
Apr 16 18:42:14.816966 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:14.816933 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" event={"ID":"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6","Type":"ContainerStarted","Data":"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"}
Apr 16 18:42:14.817359 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:14.817070 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:42:14.833148 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:14.833096 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podStartSLOduration=1.833078669 podStartE2EDuration="1.833078669s" podCreationTimestamp="2026-04-16 18:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:42:14.831512152 +0000 UTC m=+1935.420710437" watchObservedRunningTime="2026-04-16 18:42:14.833078669 +0000 UTC m=+1935.422276954"
Apr 16 18:42:20.825500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:42:20.825468 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:45:00.012743 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:45:00.012716 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:45:00.015500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:45:00.015479 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:50:00.034467 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:00.034433 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:50:00.037942 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:00.037921 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:50:28.077890 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:28.077809 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:50:28.078403 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:28.078039 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" containerID="cri-o://80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2" gracePeriod=30
Apr 16 18:50:30.824351 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:30.824312 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:35.823630 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:35.823565 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:40.823778 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:40.823742 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:40.824186 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:40.823850 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:50:45.824452 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:45.824405 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:50.823945 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:50.823893 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:55.823998 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:55.823958 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:50:58.225358 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.225332 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:50:58.310956 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.310907 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls\") pod \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") "
Apr 16 18:50:58.311158 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.311020 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle\") pod \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\" (UID: \"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6\") "
Apr 16 18:50:58.311356 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.311331 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" (UID: "9f3f7f20-6cfe-4131-ad6a-809a0361a7b6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:50:58.313177 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.313146 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" (UID: "9f3f7f20-6cfe-4131-ad6a-809a0361a7b6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:50:58.334849 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.334750 2583 generic.go:358] "Generic (PLEG): container finished" podID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerID="80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2" exitCode=0
Apr 16 18:50:58.334849 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.334811 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"
Apr 16 18:50:58.334849 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.334813 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" event={"ID":"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6","Type":"ContainerDied","Data":"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"}
Apr 16 18:50:58.335117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.334860 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj" event={"ID":"9f3f7f20-6cfe-4131-ad6a-809a0361a7b6","Type":"ContainerDied","Data":"41ae1aa5d61d7841902fa1d464bb73977e3cf406225e6a88c852ddd8a94095f8"}
Apr 16 18:50:58.335117 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.334881 2583 scope.go:117] "RemoveContainer" containerID="80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"
Apr 16 18:50:58.342722 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.342697 2583 scope.go:117] "RemoveContainer" containerID="80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"
Apr 16 18:50:58.342977 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:50:58.342958 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2\": container with ID starting with 80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2 not found: ID does not exist" containerID="80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"
Apr 16 18:50:58.343038 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.342985 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2"} err="failed to get container status \"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2\": rpc error: code = NotFound desc = could not find container \"80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2\": container with ID starting with 80053861420f3897ca67ce479e21d26877bbaa0a0d1164342bde6a9ab32434e2 not found: ID does not exist"
Apr 16 18:50:58.355724 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.355692 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:50:58.358951 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.358926 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-1ac5a-67bcd68668-mm8sj"]
Apr 16 18:50:58.412413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.412381 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:50:58.412413 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:58.412408 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:50:59.991129 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:50:59.991098 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" path="/var/lib/kubelet/pods/9f3f7f20-6cfe-4131-ad6a-809a0361a7b6/volumes"
Apr 16 18:55:00.058130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:55:00.058010 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:55:00.062231 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:55:00.061320 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:58:02.303809 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:02.303775 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"]
Apr 16 18:58:02.304357 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:02.304035 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" containerID="cri-o://ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52" gracePeriod=30
Apr 16 18:58:03.735350 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.735310 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jj7nc/must-gather-cp8hl"]
Apr 16 18:58:03.735754 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.735645 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a"
Apr 16 18:58:03.735754 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.735657 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a"
Apr 16 18:58:03.735754 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.735726 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f3f7f20-6cfe-4131-ad6a-809a0361a7b6" containerName="splitter-graph-1ac5a"
Apr 16 18:58:03.738638 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.738621 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.740787 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.740768 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jj7nc\"/\"openshift-service-ca.crt\""
Apr 16 18:58:03.741411 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.741393 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jj7nc\"/\"kube-root-ca.crt\""
Apr 16 18:58:03.741502 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.741395 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jj7nc\"/\"default-dockercfg-9zpjg\""
Apr 16 18:58:03.754591 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.754562 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jj7nc/must-gather-cp8hl"]
Apr 16 18:58:03.885129 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.885088 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.885307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.885209 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csb25\" (UniqueName: \"kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.986538 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.986439 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csb25\" (UniqueName: \"kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.986538 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.986514 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.986840 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.986824 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:03.996215 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:03.996183 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csb25\" (UniqueName: \"kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25\") pod \"must-gather-cp8hl\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:04.060733 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:04.060699 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" Apr 16 18:58:04.183646 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:04.183613 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jj7nc/must-gather-cp8hl"] Apr 16 18:58:04.186762 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:58:04.186732 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e204d4_1573_43dd_be90_26ce38773d6d.slice/crio-708f74c015d368024d29db35fef5a3c6d1e12610c0ce65cd938a0d778f859bc7 WatchSource:0}: Error finding container 708f74c015d368024d29db35fef5a3c6d1e12610c0ce65cd938a0d778f859bc7: Status 404 returned error can't find the container with id 708f74c015d368024d29db35fef5a3c6d1e12610c0ce65cd938a0d778f859bc7 Apr 16 18:58:04.188463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:04.188448 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:58:04.573448 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:04.573403 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" event={"ID":"c9e204d4-1573-43dd-be90-26ce38773d6d","Type":"ContainerStarted","Data":"708f74c015d368024d29db35fef5a3c6d1e12610c0ce65cd938a0d778f859bc7"} Apr 16 18:58:05.731549 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:05.731487 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:08.587460 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:08.587358 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" 
event={"ID":"c9e204d4-1573-43dd-be90-26ce38773d6d","Type":"ContainerStarted","Data":"1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0"} Apr 16 18:58:08.587460 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:08.587414 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" event={"ID":"c9e204d4-1573-43dd-be90-26ce38773d6d","Type":"ContainerStarted","Data":"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"} Apr 16 18:58:08.603702 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:08.603641 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" podStartSLOduration=1.548742346 podStartE2EDuration="5.603620862s" podCreationTimestamp="2026-04-16 18:58:03 +0000 UTC" firstStartedPulling="2026-04-16 18:58:04.18857018 +0000 UTC m=+2884.777768444" lastFinishedPulling="2026-04-16 18:58:08.243448693 +0000 UTC m=+2888.832646960" observedRunningTime="2026-04-16 18:58:08.601216839 +0000 UTC m=+2889.190415143" watchObservedRunningTime="2026-04-16 18:58:08.603620862 +0000 UTC m=+2889.192819149" Apr 16 18:58:10.730113 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:10.730068 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:15.731971 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:15.731931 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:15.732430 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:15.732044 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" Apr 16 18:58:17.377685 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:17.377654 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:18.148166 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:18.148135 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:18.892365 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:18.892332 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:19.616324 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:19.616286 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:20.371865 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:20.371825 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:20.729554 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:20.729510 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:21.112682 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:21.112568 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 
16 18:58:21.857307 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:21.857273 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:22.596881 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:22.596856 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:23.326863 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:23.326827 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:24.063473 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:24.063445 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:24.791854 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:24.791820 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:25.567936 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:25.567908 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-dbc56-7657877866-wv6tx_3fcbb6aa-4eba-4397-80a4-53930b788235/switch-graph-dbc56/0.log" Apr 16 18:58:25.730258 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:25.730215 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:26.646834 ip-10-0-128-95 kubenswrapper[2583]: I0416 
18:58:26.646801 2583 generic.go:358] "Generic (PLEG): container finished" podID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerID="42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d" exitCode=0 Apr 16 18:58:26.647247 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:26.646875 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" event={"ID":"c9e204d4-1573-43dd-be90-26ce38773d6d","Type":"ContainerDied","Data":"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"} Apr 16 18:58:26.647247 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:26.647230 2583 scope.go:117] "RemoveContainer" containerID="42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d" Apr 16 18:58:27.536832 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:27.536800 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jj7nc_must-gather-cp8hl_c9e204d4-1573-43dd-be90-26ce38773d6d/gather/0.log" Apr 16 18:58:28.142777 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.142741 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lv8lf/must-gather-s4pvl"] Apr 16 18:58:28.146206 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.146192 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.148529 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.148495 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"openshift-service-ca.crt\"" Apr 16 18:58:28.149080 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.149059 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lv8lf\"/\"default-dockercfg-r8bqr\"" Apr 16 18:58:28.149168 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.149081 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lv8lf\"/\"kube-root-ca.crt\"" Apr 16 18:58:28.153699 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.153679 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/must-gather-s4pvl"] Apr 16 18:58:28.195901 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.195875 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktn7\" (UniqueName: \"kubernetes.io/projected/18803de7-8b08-4f6d-adce-00934d70032f-kube-api-access-sktn7\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.196044 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.195909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/18803de7-8b08-4f6d-adce-00934d70032f-must-gather-output\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.296409 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.296366 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/18803de7-8b08-4f6d-adce-00934d70032f-must-gather-output\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.296610 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.296460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sktn7\" (UniqueName: \"kubernetes.io/projected/18803de7-8b08-4f6d-adce-00934d70032f-kube-api-access-sktn7\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.296802 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.296778 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/18803de7-8b08-4f6d-adce-00934d70032f-must-gather-output\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.305006 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.304974 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktn7\" (UniqueName: \"kubernetes.io/projected/18803de7-8b08-4f6d-adce-00934d70032f-kube-api-access-sktn7\") pod \"must-gather-s4pvl\" (UID: \"18803de7-8b08-4f6d-adce-00934d70032f\") " pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.455962 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.455865 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lv8lf/must-gather-s4pvl" Apr 16 18:58:28.578230 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.578106 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/must-gather-s4pvl"] Apr 16 18:58:28.580844 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:58:28.580812 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18803de7_8b08_4f6d_adce_00934d70032f.slice/crio-29c4fe2578cbc069e423ce6524ab2f17dfe101115306a3431bba0e0f07a9b5ae WatchSource:0}: Error finding container 29c4fe2578cbc069e423ce6524ab2f17dfe101115306a3431bba0e0f07a9b5ae: Status 404 returned error can't find the container with id 29c4fe2578cbc069e423ce6524ab2f17dfe101115306a3431bba0e0f07a9b5ae Apr 16 18:58:28.653221 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:28.653177 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/must-gather-s4pvl" event={"ID":"18803de7-8b08-4f6d-adce-00934d70032f","Type":"ContainerStarted","Data":"29c4fe2578cbc069e423ce6524ab2f17dfe101115306a3431bba0e0f07a9b5ae"} Apr 16 18:58:29.660788 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:29.660742 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/must-gather-s4pvl" event={"ID":"18803de7-8b08-4f6d-adce-00934d70032f","Type":"ContainerStarted","Data":"e09147a46dfdf37dce50a1bd7bed4a1c097ea9400a2027108ffca7034d099b71"} Apr 16 18:58:30.665461 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:30.665424 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/must-gather-s4pvl" event={"ID":"18803de7-8b08-4f6d-adce-00934d70032f","Type":"ContainerStarted","Data":"d6fff459347e00ed3ab2971f156aa1cf768fa6462861d7459c36648fb5e327d3"} Apr 16 18:58:30.680171 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:30.679725 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-lv8lf/must-gather-s4pvl" podStartSLOduration=1.8573187230000001 podStartE2EDuration="2.679706922s" podCreationTimestamp="2026-04-16 18:58:28 +0000 UTC" firstStartedPulling="2026-04-16 18:58:28.582611524 +0000 UTC m=+2909.171809789" lastFinishedPulling="2026-04-16 18:58:29.404999719 +0000 UTC m=+2909.994197988" observedRunningTime="2026-04-16 18:58:30.678964219 +0000 UTC m=+2911.268162506" watchObservedRunningTime="2026-04-16 18:58:30.679706922 +0000 UTC m=+2911.268905214" Apr 16 18:58:30.731237 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:30.731199 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:58:30.787303 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:30.787266 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-569hm_5a325d12-fc69-40c7-a5dd-aa1bb836aedd/global-pull-secret-syncer/0.log" Apr 16 18:58:31.021487 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:31.021400 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vhpgc_b4b5a677-a74e-41a0-9821-6b56e1c0328c/konnectivity-agent/0.log" Apr 16 18:58:31.047423 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:31.047376 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-95.ec2.internal_1c631bb54e7f931dcf513f44e89f6bf7/haproxy/0.log" Apr 16 18:58:32.631599 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.631292 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" Apr 16 18:58:32.638602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.637781 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle\") pod \"3fcbb6aa-4eba-4397-80a4-53930b788235\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " Apr 16 18:58:32.638602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.637830 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") pod \"3fcbb6aa-4eba-4397-80a4-53930b788235\" (UID: \"3fcbb6aa-4eba-4397-80a4-53930b788235\") " Apr 16 18:58:32.638850 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.638694 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3fcbb6aa-4eba-4397-80a4-53930b788235" (UID: "3fcbb6aa-4eba-4397-80a4-53930b788235"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:58:32.643599 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.640830 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3fcbb6aa-4eba-4397-80a4-53930b788235" (UID: "3fcbb6aa-4eba-4397-80a4-53930b788235"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:58:32.679602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.678736 2583 generic.go:358] "Generic (PLEG): container finished" podID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerID="ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52" exitCode=0 Apr 16 18:58:32.679602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.678842 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" event={"ID":"3fcbb6aa-4eba-4397-80a4-53930b788235","Type":"ContainerDied","Data":"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52"} Apr 16 18:58:32.679602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.678872 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" event={"ID":"3fcbb6aa-4eba-4397-80a4-53930b788235","Type":"ContainerDied","Data":"a424697ccab94770dc20dbba7c67a358f7871cdca058cd40fd546e450a4bd565"} Apr 16 18:58:32.679602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.678892 2583 scope.go:117] "RemoveContainer" containerID="ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52" Apr 16 18:58:32.679602 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.679047 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx" Apr 16 18:58:32.706098 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.705970 2583 scope.go:117] "RemoveContainer" containerID="ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52" Apr 16 18:58:32.706990 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:58:32.706338 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52\": container with ID starting with ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52 not found: ID does not exist" containerID="ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52" Apr 16 18:58:32.706990 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.706375 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52"} err="failed to get container status \"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52\": rpc error: code = NotFound desc = could not find container \"ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52\": container with ID starting with ceff01d5bbcc5b66b29c5d93085bb5818ac90313c7df42e0edcfb17f614fbe52 not found: ID does not exist" Apr 16 18:58:32.716538 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.713418 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"] Apr 16 18:58:32.722555 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.719567 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dbc56-7657877866-wv6tx"] Apr 16 18:58:32.741384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.738504 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3fcbb6aa-4eba-4397-80a4-53930b788235-openshift-service-ca-bundle\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:58:32.741384 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.738536 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3fcbb6aa-4eba-4397-80a4-53930b788235-proxy-tls\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\"" Apr 16 18:58:32.983292 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.982402 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jj7nc/must-gather-cp8hl"] Apr 16 18:58:32.983292 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.982473 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jj7nc/must-gather-cp8hl"] Apr 16 18:58:32.984921 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.984300 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="copy" containerID="cri-o://1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0" gracePeriod=2 Apr 16 18:58:32.987140 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:32.987089 2583 status_manager.go:895] "Failed to get status for pod" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" err="pods \"must-gather-cp8hl\" is forbidden: User \"system:node:ip-10-0-128-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jj7nc\": no relationship found between node 'ip-10-0-128-95.ec2.internal' and this object" Apr 16 18:58:33.347605 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.345758 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jj7nc_must-gather-cp8hl_c9e204d4-1573-43dd-be90-26ce38773d6d/copy/0.log" Apr 16 18:58:33.347605 ip-10-0-128-95 
kubenswrapper[2583]: I0416 18:58:33.346240 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" Apr 16 18:58:33.348608 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.348088 2583 status_manager.go:895] "Failed to get status for pod" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" err="pods \"must-gather-cp8hl\" is forbidden: User \"system:node:ip-10-0-128-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jj7nc\": no relationship found between node 'ip-10-0-128-95.ec2.internal' and this object" Apr 16 18:58:33.355601 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.354048 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csb25\" (UniqueName: \"kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25\") pod \"c9e204d4-1573-43dd-be90-26ce38773d6d\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " Apr 16 18:58:33.355601 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.354133 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output\") pod \"c9e204d4-1573-43dd-be90-26ce38773d6d\" (UID: \"c9e204d4-1573-43dd-be90-26ce38773d6d\") " Apr 16 18:58:33.359596 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.357197 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c9e204d4-1573-43dd-be90-26ce38773d6d" (UID: "c9e204d4-1573-43dd-be90-26ce38773d6d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:58:33.368095 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.361706 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25" (OuterVolumeSpecName: "kube-api-access-csb25") pod "c9e204d4-1573-43dd-be90-26ce38773d6d" (UID: "c9e204d4-1573-43dd-be90-26ce38773d6d"). InnerVolumeSpecName "kube-api-access-csb25". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:58:33.456336 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.456294 2583 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c9e204d4-1573-43dd-be90-26ce38773d6d-must-gather-output\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:58:33.456336 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.456336 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-csb25\" (UniqueName: \"kubernetes.io/projected/c9e204d4-1573-43dd-be90-26ce38773d6d-kube-api-access-csb25\") on node \"ip-10-0-128-95.ec2.internal\" DevicePath \"\""
Apr 16 18:58:33.686381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.686345 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jj7nc_must-gather-cp8hl_c9e204d4-1573-43dd-be90-26ce38773d6d/copy/0.log"
Apr 16 18:58:33.686861 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.686777 2583 generic.go:358] "Generic (PLEG): container finished" podID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerID="1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0" exitCode=143
Apr 16 18:58:33.686923 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.686884 2583 scope.go:117] "RemoveContainer" containerID="1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0"
Apr 16 18:58:33.687032 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.687014 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jj7nc/must-gather-cp8hl"
Apr 16 18:58:33.699537 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.699467 2583 status_manager.go:895] "Failed to get status for pod" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" err="pods \"must-gather-cp8hl\" is forbidden: User \"system:node:ip-10-0-128-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jj7nc\": no relationship found between node 'ip-10-0-128-95.ec2.internal' and this object"
Apr 16 18:58:33.705209 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.705164 2583 status_manager.go:895] "Failed to get status for pod" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" pod="openshift-must-gather-jj7nc/must-gather-cp8hl" err="pods \"must-gather-cp8hl\" is forbidden: User \"system:node:ip-10-0-128-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jj7nc\": no relationship found between node 'ip-10-0-128-95.ec2.internal' and this object"
Apr 16 18:58:33.716130 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.714885 2583 scope.go:117] "RemoveContainer" containerID="42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"
Apr 16 18:58:33.742523 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.742441 2583 scope.go:117] "RemoveContainer" containerID="1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0"
Apr 16 18:58:33.744325 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:58:33.744254 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0\": container with ID starting with 1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0 not found: ID does not exist" containerID="1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0"
Apr 16 18:58:33.744325 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.744297 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0"} err="failed to get container status \"1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0\": rpc error: code = NotFound desc = could not find container \"1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0\": container with ID starting with 1930411b6b6fc6de9f6b7af888d717b3f327a9ba5063f5bdcfe093b23fb16ea0 not found: ID does not exist"
Apr 16 18:58:33.744325 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.744323 2583 scope.go:117] "RemoveContainer" containerID="42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"
Apr 16 18:58:33.744761 ip-10-0-128-95 kubenswrapper[2583]: E0416 18:58:33.744725 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d\": container with ID starting with 42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d not found: ID does not exist" containerID="42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"
Apr 16 18:58:33.744905 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.744885 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d"} err="failed to get container status \"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d\": rpc error: code = NotFound desc = could not find container \"42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d\": container with ID starting with 42f149ebb69b147f22d2a93da3ccfda69787b08f7eefaf8dabdaaa3de61fa95d not found: ID does not exist"
Apr 16 18:58:33.993329 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.993247 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" path="/var/lib/kubelet/pods/3fcbb6aa-4eba-4397-80a4-53930b788235/volumes"
Apr 16 18:58:33.994363 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:33.994332 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" path="/var/lib/kubelet/pods/c9e204d4-1573-43dd-be90-26ce38773d6d/volumes"
Apr 16 18:58:34.546891 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.546861 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/alertmanager/0.log"
Apr 16 18:58:34.578497 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.578469 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/config-reloader/0.log"
Apr 16 18:58:34.604731 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.604692 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/kube-rbac-proxy-web/0.log"
Apr 16 18:58:34.632357 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.632277 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/kube-rbac-proxy/0.log"
Apr 16 18:58:34.662444 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.662408 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/kube-rbac-proxy-metric/0.log"
Apr 16 18:58:34.689381 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.689265 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/prom-label-proxy/0.log"
Apr 16 18:58:34.719695 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.719662 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2c160745-48d9-4d31-823f-b47ca84ded99/init-config-reloader/0.log"
Apr 16 18:58:34.941424 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.941388 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j95hp_935b67f9-01a4-4c36-99a5-76ff15afe07f/node-exporter/0.log"
Apr 16 18:58:34.967980 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.967951 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j95hp_935b67f9-01a4-4c36-99a5-76ff15afe07f/kube-rbac-proxy/0.log"
Apr 16 18:58:34.997249 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:34.997223 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j95hp_935b67f9-01a4-4c36-99a5-76ff15afe07f/init-textfile/0.log"
Apr 16 18:58:35.205160 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.205070 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9wqsc_bf5cf0e6-a55b-42f8-b609-f62502a6a48b/kube-rbac-proxy-main/0.log"
Apr 16 18:58:35.231052 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.231022 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9wqsc_bf5cf0e6-a55b-42f8-b609-f62502a6a48b/kube-rbac-proxy-self/0.log"
Apr 16 18:58:35.257813 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.257782 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-9wqsc_bf5cf0e6-a55b-42f8-b609-f62502a6a48b/openshift-state-metrics/0.log"
Apr 16 18:58:35.310473 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.310436 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/prometheus/0.log"
Apr 16 18:58:35.332062 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.332035 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/config-reloader/0.log"
Apr 16 18:58:35.356630 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.356599 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/thanos-sidecar/0.log"
Apr 16 18:58:35.380881 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.380852 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/kube-rbac-proxy-web/0.log"
Apr 16 18:58:35.407856 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.407827 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/kube-rbac-proxy/0.log"
Apr 16 18:58:35.433297 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.433269 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/kube-rbac-proxy-thanos/0.log"
Apr 16 18:58:35.459045 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.458952 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a16dd21a-5ad7-40e1-8bbb-fa745f3f4c46/init-config-reloader/0.log"
Apr 16 18:58:35.494088 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.494057 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-qjv4v_add4f806-9183-43ee-8bbc-9162c6bd6dc1/prometheus-operator/0.log"
Apr 16 18:58:35.516435 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.516402 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-qjv4v_add4f806-9183-43ee-8bbc-9162c6bd6dc1/kube-rbac-proxy/0.log"
Apr 16 18:58:35.579804 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.579772 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b488c7c54-zh9bb_dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c/telemeter-client/0.log"
Apr 16 18:58:35.603094 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.603057 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b488c7c54-zh9bb_dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c/reload/0.log"
Apr 16 18:58:35.627895 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:35.627868 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b488c7c54-zh9bb_dc3eddbc-d2f4-4ce2-8cd0-8e12366a208c/kube-rbac-proxy/0.log"
Apr 16 18:58:37.897254 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:37.897228 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f55d4c9bf-96jqv_23874d6f-d723-4869-a1ef-da1eef2c795b/console/0.log"
Apr 16 18:58:38.219463 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219378 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"]
Apr 16 18:58:38.219887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219861 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56"
Apr 16 18:58:38.219887 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219888 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219926 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="copy"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219935 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="copy"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219952 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="gather"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.219961 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="gather"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.220032 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fcbb6aa-4eba-4397-80a4-53930b788235" containerName="switch-graph-dbc56"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.220048 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="copy"
Apr 16 18:58:38.220084 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.220060 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9e204d4-1573-43dd-be90-26ce38773d6d" containerName="gather"
Apr 16 18:58:38.224402 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.224374 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.231354 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.231327 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"]
Apr 16 18:58:38.299916 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.299877 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-sys\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.300282 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.300260 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-podres\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.300500 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.300481 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-proc\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.300694 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.300673 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-lib-modules\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.300779 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.300742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6xr\" (UniqueName: \"kubernetes.io/projected/c8c20f20-1134-4c25-8e49-e34edce880d2-kube-api-access-xx6xr\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401665 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401630 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-podres\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401681 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-proc\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401721 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-lib-modules\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401747 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6xr\" (UniqueName: \"kubernetes.io/projected/c8c20f20-1134-4c25-8e49-e34edce880d2-kube-api-access-xx6xr\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401799 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-podres\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.401828 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401818 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-sys\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.402015 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401845 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-proc\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.402015 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-sys\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.402015 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.401890 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8c20f20-1134-4c25-8e49-e34edce880d2-lib-modules\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.411215 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.411160 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6xr\" (UniqueName: \"kubernetes.io/projected/c8c20f20-1134-4c25-8e49-e34edce880d2-kube-api-access-xx6xr\") pod \"perf-node-gather-daemonset-d4sh8\" (UID: \"c8c20f20-1134-4c25-8e49-e34edce880d2\") " pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.537856 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.537777 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:38.677943 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.677907 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"]
Apr 16 18:58:38.683537 ip-10-0-128-95 kubenswrapper[2583]: W0416 18:58:38.683509 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc8c20f20_1134_4c25_8e49_e34edce880d2.slice/crio-19e4aebc6de79865e7ff6796674bcea98b77721aa9a8f131ae9817b04775a912 WatchSource:0}: Error finding container 19e4aebc6de79865e7ff6796674bcea98b77721aa9a8f131ae9817b04775a912: Status 404 returned error can't find the container with id 19e4aebc6de79865e7ff6796674bcea98b77721aa9a8f131ae9817b04775a912
Apr 16 18:58:38.709079 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:38.709053 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8" event={"ID":"c8c20f20-1134-4c25-8e49-e34edce880d2","Type":"ContainerStarted","Data":"19e4aebc6de79865e7ff6796674bcea98b77721aa9a8f131ae9817b04775a912"}
Apr 16 18:58:39.207954 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.207923 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zkgtz_f60a69fc-3609-441f-9761-47098d24b5d0/dns/0.log"
Apr 16 18:58:39.249248 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.249216 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zkgtz_f60a69fc-3609-441f-9761-47098d24b5d0/kube-rbac-proxy/0.log"
Apr 16 18:58:39.325166 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.325139 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hhl4h_c28c1588-a1f5-4491-bbfc-135c1d264663/dns-node-resolver/0.log"
Apr 16 18:58:39.713790 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.713749 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8" event={"ID":"c8c20f20-1134-4c25-8e49-e34edce880d2","Type":"ContainerStarted","Data":"7d7c5eee3de1c1022244415393ee003767d2df0568388bc036beda5cc9533a21"}
Apr 16 18:58:39.714016 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.714002 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:39.733340 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.733281 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8" podStartSLOduration=1.733265463 podStartE2EDuration="1.733265463s" podCreationTimestamp="2026-04-16 18:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:58:39.731870697 +0000 UTC m=+2920.321068985" watchObservedRunningTime="2026-04-16 18:58:39.733265463 +0000 UTC m=+2920.322463751"
Apr 16 18:58:39.893298 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:39.893255 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5tctm_f7de0432-d90d-4397-aa83-1a431f35bfa6/node-ca/0.log"
Apr 16 18:58:41.084054 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:41.084019 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r62m9_dda202b0-970a-4797-9f1d-010604ebe152/serve-healthcheck-canary/0.log"
Apr 16 18:58:41.564142 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:41.564112 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d9r4d_22d015bf-6f4c-43a3-9b78-b7db19a716eb/kube-rbac-proxy/0.log"
Apr 16 18:58:41.589639 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:41.589609 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d9r4d_22d015bf-6f4c-43a3-9b78-b7db19a716eb/exporter/0.log"
Apr 16 18:58:41.615147 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:41.615121 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d9r4d_22d015bf-6f4c-43a3-9b78-b7db19a716eb/extractor/0.log"
Apr 16 18:58:44.198793 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:44.198760 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-pcp7k_1babc621-fc6c-4c0e-815b-531d596a1b08/manager/0.log"
Apr 16 18:58:44.219690 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:44.219664 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-l6mrt_b7435030-4cdd-4af7-9fe7-2940c8e93876/s3-init/0.log"
Apr 16 18:58:45.730142 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:45.730108 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lv8lf/perf-node-gather-daemonset-d4sh8"
Apr 16 18:58:50.040051 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.040023 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:58:50.066420 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.066390 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/egress-router-binary-copy/0.log"
Apr 16 18:58:50.091720 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.091676 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/cni-plugins/0.log"
Apr 16 18:58:50.117151 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.117119 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/bond-cni-plugin/0.log"
Apr 16 18:58:50.148405 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.148380 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/routeoverride-cni/0.log"
Apr 16 18:58:50.176005 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.175979 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/whereabouts-cni-bincopy/0.log"
Apr 16 18:58:50.203654 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.203628 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cscsn_c3c85f82-6657-4a18-9364-c3ce61213e8a/whereabouts-cni/0.log"
Apr 16 18:58:50.810759 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.810709 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qtmlc_a5fef405-bf92-4991-86fe-f71befb39d59/kube-multus/0.log"
Apr 16 18:58:50.890123 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.890100 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-chzqx_4eacb341-6891-41dc-a3c0-09b5697178ee/network-metrics-daemon/0.log"
Apr 16 18:58:50.914745 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:50.914719 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-chzqx_4eacb341-6891-41dc-a3c0-09b5697178ee/kube-rbac-proxy/0.log"
Apr 16 18:58:52.075815 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.075784 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-controller/0.log"
Apr 16 18:58:52.094002 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.093974 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/0.log"
Apr 16 18:58:52.121985 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.121949 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovn-acl-logging/1.log"
Apr 16 18:58:52.141753 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.141721 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/kube-rbac-proxy-node/0.log"
Apr 16 18:58:52.165979 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.165952 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:58:52.194897 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.194869 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/northd/0.log"
Apr 16 18:58:52.222626 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.222570 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/nbdb/0.log"
Apr 16 18:58:52.245531 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.245506 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/sbdb/0.log"
Apr 16 18:58:52.454271 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:52.454235 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7ztx_e9da1b91-a9ae-4adf-ac9f-881e7217faad/ovnkube-controller/0.log"
Apr 16 18:58:53.867481 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:53.867441 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9rmkx_d252d242-5753-478c-9b07-d4b27eb2d3e8/network-check-target-container/0.log"
Apr 16 18:58:54.840008 ip-10-0-128-95 kubenswrapper[2583]: I0416 18:58:54.839931 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4wkgk_fb7a4ef0-d501-4e24-b5e2-d35a8b4c3916/iptables-alerter/0.log"