Apr 16 19:54:00.204466 ip-10-0-140-191 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:54:00.650714 ip-10-0-140-191 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:00.650714 ip-10-0-140-191 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:54:00.650714 ip-10-0-140-191 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:00.650714 ip-10-0-140-191 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:54:00.650714 ip-10-0-140-191 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:54:00.653899 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.653815 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:54:00.659856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659834 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:54:00.659856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659853 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:54:00.659856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659857 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:54:00.659856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659861 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:54:00.659856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659864 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659868 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659870 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659874 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659876 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659879 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659881 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659884 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659887 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659892 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659895 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659898 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659901 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659903 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659906 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659909 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659911 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659914 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659917 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:54:00.660055 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659919 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659922 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659924 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659927 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659929 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659937 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659939 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659942 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659945 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659947 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659950 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659952 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659955 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659958 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659960 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659964 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659967 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659970 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659972 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659975 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:54:00.660541 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659978 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659980 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659984 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659986 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659989 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659991 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659994 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659996 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.659999 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660003 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660008 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660011 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660014 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660017 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660019 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660022 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660025 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660027 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660029 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660032 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:00.661044 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660034 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660044 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660046 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660049 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660052 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660054 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660057 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660061 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660064 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660067 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660070 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660072 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660076 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660078 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660082 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660085 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660088 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660091 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660093 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:54:00.661528 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660096 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660098 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660101 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660103 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660502 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660507 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660510 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660513 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660516 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660519 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660522 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660524 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660527 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660530 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660532 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660535 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660537 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660540 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660544 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:54:00.661981 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660548 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660552 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660555 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660557 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660560 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660563 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660565 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660568 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660571 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660574 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660576 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660579 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660582 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660584 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660587 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660589 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660592 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660594 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660596 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660599 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:00.662491 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660602 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660604 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660606 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660609 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660611 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660613 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660616 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660619 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660622 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660624 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660626 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660630 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660632 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660635 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660638 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660640 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660643 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660646 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660648 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:00.663020 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660651 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660653 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660656 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660659 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660661 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660666 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660669 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660672 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660675 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660678 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660680 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660683 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660685 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660688 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660691 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660693 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660696 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660698 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660701 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:54:00.663500 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660704 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660706 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660709 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660712 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660714 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660717 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660720 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660722 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660725 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660728 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660730 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660733 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.660735 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661495 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661508 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661515 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661524 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661531 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661534 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661540 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661545 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:54:00.663969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661548 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661551 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661555 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661558 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661561 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661564 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661567 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661570 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661573 2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661576 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661579 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661583 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661586 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661589 2568 flags.go:64] FLAG: --config-dir=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661592 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661595 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661599 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661602 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661605 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661609 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661612 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661614 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661617 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661621 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661624 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:54:00.664490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661629 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661632 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661635 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661637 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661641 2568 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661644 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661648 2568 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661651 2568 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661654 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661656 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661659 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661663 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661666 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661669 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661672 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661674 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661677 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661680 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661683 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661685 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661688 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661691 2568 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661695 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:54:00.665089 ip-10-0-140-191
kubenswrapper[2568]: I0416 19:54:00.661698 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661702 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:54:00.665089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661705 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661708 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661711 2568 flags.go:64] FLAG: --help="false" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661714 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-140-191.ec2.internal" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661717 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661720 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661724 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661727 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661730 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661733 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661736 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661739 2568 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661742 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661745 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661748 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661751 2568 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661753 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661757 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661759 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661762 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661765 2568 flags.go:64] FLAG: --lock-file="" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661767 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661770 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661773 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:54:00.665705 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661778 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661781 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:54:00.666303 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:54:00.661784 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661787 2568 flags.go:64] FLAG: --logging-format="text" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661790 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661793 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661796 2568 flags.go:64] FLAG: --manifest-url="" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661799 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661803 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661806 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661813 2568 flags.go:64] FLAG: --max-pods="110" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661816 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661819 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661822 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661825 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661828 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661831 2568 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661833 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661840 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661843 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661846 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661850 2568 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661853 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:54:00.666303 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661859 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661862 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661865 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661867 2568 flags.go:64] FLAG: --port="10250" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661870 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661873 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0795f5469ecb12419" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661876 2568 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661880 
2568 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661883 2568 flags.go:64] FLAG: --register-node="true" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661886 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661888 2568 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661892 2568 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661895 2568 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661898 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661901 2568 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661904 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661908 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661911 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661913 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661916 2568 flags.go:64] FLAG: --runonce="false" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661919 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661922 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:54:00.661925 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661928 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661931 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661933 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:54:00.666842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661937 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661940 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661942 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661945 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661948 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661951 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661954 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661957 2568 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661960 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661965 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661968 2568 flags.go:64] FLAG: 
--tls-cert-file="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661970 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661975 2568 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661977 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661980 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661983 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661986 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661989 2568 flags.go:64] FLAG: --v="2" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661993 2568 flags.go:64] FLAG: --version="false" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.661997 2568 flags.go:64] FLAG: --vmodule="" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.662002 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.662005 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662111 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662115 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:54:00.667454 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662118 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:00.668019 ip-10-0-140-191 
kubenswrapper[2568]: W0416 19:54:00.662121 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662124 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662127 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662131 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662133 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662136 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662139 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662142 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662144 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662147 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662151 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662154 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662158 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662161 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662164 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662180 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662184 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662187 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:54:00.668019 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662190 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662192 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662195 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662197 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662200 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662203 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662205 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662208 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662213 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662216 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662219 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662222 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662225 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662228 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662230 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662233 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662235 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662238 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662241 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662243 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:54:00.668537 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662246 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662249 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662251 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662254 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662257 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662259 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662262 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662264 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662268 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662270 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662273 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662275 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662278 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662281 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662283 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662286 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662289 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662291 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662294 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662296 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:54:00.669008 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662300 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662302 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662305 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662308 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662311 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662314 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662316 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662319 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662321 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662324 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662326 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662329 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662332 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662334 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662338 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662341 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662343 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662346 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662348 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:00.669518 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662351 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662354 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662357 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662360 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662362 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.662365 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.662928 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.669281 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.669385 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669432 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669437 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669440 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669443 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669446 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669450 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669453 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:54:00.669983 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669456 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669459 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669461 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669464 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669467 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669469 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669472 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669475 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669477 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669480 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669483 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669485 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669487 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669490 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669492 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669495 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669499 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669504 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669507 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:54:00.670398 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669511 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669514 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669517 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669520 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669523 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669526 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669529 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669531 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:54:00.670856
ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669534 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669537 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669540 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669542 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669545 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669548 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669550 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669553 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669555 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669558 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669560 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669563 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:00.670856 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669566 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:54:00.671357 
ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669568 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669571 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669573 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669576 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669578 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669581 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669583 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669586 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669588 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669591 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669593 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669596 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 
19:54:00.669599 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669602 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669604 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669607 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669610 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669613 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:54:00.671357 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669615 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669618 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669621 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669623 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669626 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669629 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669633 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669636 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669639 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669641 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669644 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669646 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669649 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669651 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669654 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669657 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669659 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669662 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669664 2568 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669667 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:54:00.671852 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669669 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.669674 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669771 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669775 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669778 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669781 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669785 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669787 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 
19:54:00.669790 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669793 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669796 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669800 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669803 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669805 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669808 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669810 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:54:00.672340 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669813 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669816 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669818 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669821 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669823 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:54:00.672736 
ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669825 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669828 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669831 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669834 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669836 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669838 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669841 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669844 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669846 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669848 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669851 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669853 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669857 2568 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669860 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:54:00.672736 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669864 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669866 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669869 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669872 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669875 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669878 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669881 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669885 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669888 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669891 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669894 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669896 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669899 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669901 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669904 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669907 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669909 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669911 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669914 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669916 2568 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:54:00.673306 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669919 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669921 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669924 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669926 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669929 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669931 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669933 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669936 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669939 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669941 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669943 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669946 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: 
W0416 19:54:00.669948 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669951 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669953 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669956 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669958 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669961 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669963 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669967 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:54:00.673790 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669969 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669972 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669974 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669977 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669979 2568 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669982 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669984 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669987 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669989 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669992 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669995 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.669998 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:00.670000 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.670005 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.670763 2568 
server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 19:54:00.674291 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.672836 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 19:54:00.674662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.674382 2568 server.go:1019] "Starting client certificate rotation" Apr 16 19:54:00.674662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.674484 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:54:00.674662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.674527 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:54:00.699628 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.699611 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:54:00.701365 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.701348 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:54:00.716001 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.715981 2568 log.go:25] "Validated CRI v1 runtime API" Apr 16 19:54:00.721289 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.721275 2568 log.go:25] "Validated CRI v1 image API" Apr 16 19:54:00.723024 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.723005 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 19:54:00.725245 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.725228 2568 fs.go:135] Filesystem UUIDs: map[320ca3ee-be46-45cc-80c6-49412d9ff61a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b661dc08-16b8-4575-b64e-77552ed30f3d:/dev/nvme0n1p4] Apr 16 19:54:00.725296 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.725245 2568 
fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:54:00.729089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.729069 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:54:00.730666 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.730561 2568 manager.go:217] Machine: {Timestamp:2026-04-16 19:54:00.728850513 +0000 UTC m=+0.404852580 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100205 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ab5d04d0e5b3ed09479e194bec290 SystemUUID:ec2ab5d0-4d0e-5b3e-d094-79e194bec290 BootID:a5a9ac61-a6c4-400c-8e26-6e8573e75ee0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1a:34:23:aa:4b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1a:34:23:aa:4b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:8b:5e:90:23:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:54:00.730666 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.730660 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:54:00.730784 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.730772 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:54:00.732575 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.732549 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:54:00.732735 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.732578 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-191.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:54:00.732781 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.732743 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:54:00.732781 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.732752 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:54:00.732781 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.732769 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:54:00.733565 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.733554 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:54:00.734971 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.734960 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:54:00.735083 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.735074 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:54:00.738441 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.738432 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:54:00.738474 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.738446 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:54:00.738474 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.738457 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:54:00.738474 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.738467 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:54:00.738574 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.738475 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 19:54:00.739581 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.739569 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:54:00.739631 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.739587 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:54:00.742446 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.742429 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 19:54:00.744074 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.744057 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 19:54:00.745226 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745212 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745232 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745242 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745250 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745258 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745279 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745289 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745296 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745306 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745315 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 19:54:00.745330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745333 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 19:54:00.745614 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.745346 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 19:54:00.746914 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.746892 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 19:54:00.746914 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.746908 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 19:54:00.747023 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.746934 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zhqwk"
Apr 16 19:54:00.750101 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.750042 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-191.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 19:54:00.750101 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.750044 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 19:54:00.750389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.750376 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 19:54:00.750449 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.750420 2568 server.go:1295] "Started kubelet"
Apr 16 19:54:00.750507 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.750485 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 19:54:00.750615 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.750565 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 19:54:00.750656 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.750644 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 19:54:00.751265 ip-10-0-140-191 systemd[1]: Started Kubernetes Kubelet.
Apr 16 19:54:00.751999 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.751956 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-191.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 19:54:00.752269 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.752103 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 19:54:00.752404 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.752392 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 19:54:00.754481 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.754461 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zhqwk"
Apr 16 19:54:00.758210 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.758189 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 19:54:00.758622 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.758605 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.759422 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.759443 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.759545 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.759612 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.759623 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:54:00.759754 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.758212 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-191.ec2.internal.18a6ee69b0db7845 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-191.ec2.internal,UID:ip-10-0-140-191.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-191.ec2.internal,},FirstTimestamp:2026-04-16 19:54:00.750389317 +0000 UTC m=+0.426391391,LastTimestamp:2026-04-16 19:54:00.750389317 +0000 UTC m=+0.426391391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-191.ec2.internal,}"
Apr 16 19:54:00.760368 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.760350 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:00.762465 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762446 2568 factory.go:153] Registering CRI-O factory
Apr 16 19:54:00.762559 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762522 2568 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:54:00.762617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762576 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:54:00.762617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762585 2568 factory.go:55] Registering systemd factory
Apr 16 19:54:00.762617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762593 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:54:00.762617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762613 2568 factory.go:103] Registering Raw factory
Apr 16 19:54:00.762801 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.762625 2568 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:54:00.762918 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.762900 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:54:00.763045 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.763031 2568 manager.go:319] Starting recovery of all containers
Apr 16 19:54:00.764788 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.764770 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:00.768803 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.768776 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-191.ec2.internal\" not found" node="ip-10-0-140-191.ec2.internal"
Apr 16 19:54:00.770582 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.770562 2568 manager.go:324] Recovery completed
Apr 16 19:54:00.775702 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.775690 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.777937 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.777921 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.778009 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.777949 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.778009 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.777959 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:00.778418 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.778404 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:54:00.778418 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.778415 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:54:00.778514 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.778431 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:54:00.780660 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.780647 2568 policy_none.go:49] "None policy: Start"
Apr 16 19:54:00.780699 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.780664 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:54:00.780699 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.780674 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:54:00.817722 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.817706 2568 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.817766 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.817781 2568 server.go:85] "Starting device plugin registration server"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.818011 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.818024 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.818156 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.818269 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.818278 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.819017 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 19:54:00.831836 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.819048 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:00.891122 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.891092 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:54:00.892259 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.892244 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:54:00.892350 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.892266 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:54:00.892350 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.892281 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:54:00.892350 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.892289 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:54:00.892350 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.892321 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:54:00.895226 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.895206 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:00.918192 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.918136 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.919234 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.919219 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.919292 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.919249 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.919292 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.919259 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:00.919292 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.919280 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-191.ec2.internal"
Apr 16 19:54:00.928183 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.928154 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-191.ec2.internal"
Apr 16 19:54:00.928242 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.928191 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-191.ec2.internal\": node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:00.946860 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:00.946838 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:00.992408 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.992374 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"]
Apr 16 19:54:00.992469 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.992447 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.993186 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.993155 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.993244 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.993202 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.993244 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.993214 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:00.994389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.994378 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.994506 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.994493 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:00.994553 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.994519 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.995010 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.994994 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.995085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.995013 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.995085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.995022 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.995085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.995032 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.995085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.995038 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:00.995085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.995047 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:00.996374 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.996361 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:00.996423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.996385 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:54:00.997032 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.997010 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:54:00.997110 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.997041 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:54:00.997110 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:00.997054 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:54:01.018668 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.018653 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-191.ec2.internal\" not found" node="ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.023002 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.022987 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-191.ec2.internal\" not found" node="ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.047581 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.047561 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.061031 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.061007 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.061085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.061046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.061085 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.061074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab1d9840308a9e93932284ca7f6a67ee-config\") pod \"kube-apiserver-proxy-ip-10-0-140-191.ec2.internal\" (UID: \"ab1d9840308a9e93932284ca7f6a67ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.148563 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.148538 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.161413 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.161483 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.161483 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab1d9840308a9e93932284ca7f6a67ee-config\") pod \"kube-apiserver-proxy-ip-10-0-140-191.ec2.internal\" (UID: \"ab1d9840308a9e93932284ca7f6a67ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.161550 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ab1d9840308a9e93932284ca7f6a67ee-config\") pod \"kube-apiserver-proxy-ip-10-0-140-191.ec2.internal\" (UID: \"ab1d9840308a9e93932284ca7f6a67ee\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.161550 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.161550 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.161513 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9c0ec544a6cdf582befbb1945cc59559-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal\" (UID: \"9c0ec544a6cdf582befbb1945cc59559\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.249262 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.249200 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.320663 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.320642 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.325249 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.325232 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal"
Apr 16 19:54:01.349947 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.349926 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.450466 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.450442 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.550931 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.550881 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.651483 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.651460 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.673907 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.673886 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:54:01.674039 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.674020 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:54:01.674092 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.674034 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:54:01.752504 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.752481 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.758474 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.758457 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:54:01.758908 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.758877 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:49:00 +0000 UTC" deadline="2028-01-08 01:30:49.77605673 +0000 UTC"
Apr 16 19:54:01.759006 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.758914 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15149h36m48.017156265s"
Apr 16 19:54:01.771789 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.771766 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:54:01.794242 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.794221 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8scdw"
Apr 16 19:54:01.799919 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.799902 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8scdw"
Apr 16 19:54:01.853552 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.853499 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:01.878615 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:01.878580 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab1d9840308a9e93932284ca7f6a67ee.slice/crio-7abcc3e5e0be914aae73cf562b92ed8d346a87e63bf2456fbcde2afec52934bb WatchSource:0}: Error finding container 7abcc3e5e0be914aae73cf562b92ed8d346a87e63bf2456fbcde2afec52934bb: Status 404 returned error can't find the container with id 7abcc3e5e0be914aae73cf562b92ed8d346a87e63bf2456fbcde2afec52934bb
Apr 16 19:54:01.879125 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:01.879098 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c0ec544a6cdf582befbb1945cc59559.slice/crio-f13f632a2f142707b907eb92dffd2868546b6582dcdb888b58a303b429b33986 WatchSource:0}: Error finding container f13f632a2f142707b907eb92dffd2868546b6582dcdb888b58a303b429b33986: Status 404 returned error can't find the container with id f13f632a2f142707b907eb92dffd2868546b6582dcdb888b58a303b429b33986
Apr 16 19:54:01.882664 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.882648 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:54:01.895521 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.895483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal" event={"ID":"ab1d9840308a9e93932284ca7f6a67ee","Type":"ContainerStarted","Data":"7abcc3e5e0be914aae73cf562b92ed8d346a87e63bf2456fbcde2afec52934bb"}
Apr 16 19:54:01.896389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.896368 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal" event={"ID":"9c0ec544a6cdf582befbb1945cc59559","Type":"ContainerStarted","Data":"f13f632a2f142707b907eb92dffd2868546b6582dcdb888b58a303b429b33986"}
Apr 16 19:54:01.948436 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:01.948414 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:01.953568 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:01.953549 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:02.054022 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.053999 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:02.154609 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.154537 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:02.255331 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.255297 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-191.ec2.internal\" not found"
Apr 16 19:54:02.347763 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.347708 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:02.359659 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.359635 2568 kubelet.go:3340] "Creating a mirror pod for static pod"
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal" Apr 16 19:54:02.371506 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.371402 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:54:02.372494 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.372299 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal" Apr 16 19:54:02.381292 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.381188 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:54:02.570969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.570940 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:02.739604 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.739574 2568 apiserver.go:52] "Watching apiserver" Apr 16 19:54:02.750542 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.750520 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:54:02.750937 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.750913 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-cspns","openshift-image-registry/node-ca-mcvlm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal","openshift-multus/multus-5zhw7","openshift-multus/network-metrics-daemon-v62bb","openshift-network-diagnostics/network-check-target-dpc5h","kube-system/konnectivity-agent-nnsm9","kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr","openshift-cluster-node-tuning-operator/tuned-cqztz","openshift-multus/multus-additional-cni-plugins-p856m","openshift-network-operator/iptables-alerter-4fs8r"] Apr 16 19:54:02.752533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.752512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.753884 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.753826 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.755239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.755219 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:02.756347 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.756328 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:54:02.756447 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.756409 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hdjfc\"" Apr 16 19:54:02.756713 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.756686 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.759377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.758243 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:54:02.759377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.758534 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:54:02.759377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.758581 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:54:02.759377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.758734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:54:02.759377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.758814 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:54:02.760251 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.759649 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.760251 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.760099 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:54:02.760251 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.760121 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:54:02.760251 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.760215 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q6cfv\"" Apr 16 19:54:02.760251 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.760222 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:54:02.761238 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.761219 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.762556 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.762538 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.763975 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.763956 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.764073 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.764048 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:02.765263 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.765245 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:02.765344 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.765304 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:02.766300 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:54:02.766588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766571 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.766976 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766957 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:02.766976 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766966 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:54:02.767107 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766979 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:54:02.767107 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.766991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:54:02.767107 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.767010 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:54:02.767444 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.767425 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:02.771208 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-sys-fs\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.771294 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771246 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-netns\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771294 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771281 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-run\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-systemd-units\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-device-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771384 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-systemd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771407 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-config\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-lib-modules\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.771681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-slash\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-ovn\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-tmp\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.771681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2sp\" (UniqueName: \"kubernetes.io/projected/bf55464c-e6ac-41d5-98de-59d9df6a82e0-kube-api-access-dd2sp\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.771681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771650 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4z49\" (UniqueName: \"kubernetes.io/projected/ede74e32-9e13-4250-9116-a7ce9f6af0a6-kube-api-access-c4z49\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:54:02.771707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-serviceca\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-socket-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771785 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-registration-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-node-log\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-konnectivity-ca\") pod \"konnectivity-agent-nnsm9\" (UID: 
\"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.771862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771866 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-conf\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771881 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-systemd\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cnibin\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771918 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7hq\" (UniqueName: 
\"kubernetes.io/projected/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-kube-api-access-gm7hq\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-host\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-kubelet\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.771971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysconfig\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772057 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-system-cni-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-os-release\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772113 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-bin\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-netd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovn-node-metrics-cert\") pod \"ovnkube-node-cspns\" (UID: 
\"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772205 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwg68\" (UniqueName: \"kubernetes.io/projected/9bb855b9-5001-4021-a934-ecc26434d057-kube-api-access-mwg68\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.772239 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-sys\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772291 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-etc-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772323 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-log-socket\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772343 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-kubernetes\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772360 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmc2l\" (UniqueName: \"kubernetes.io/projected/5b914463-981e-407b-9d5c-37f855389e30-kube-api-access-zmc2l\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-script-lib\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-etc-tuned\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-host\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772514 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-agent-certs\") pod \"konnectivity-agent-nnsm9\" (UID: \"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-etc-selinux\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-modprobe-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772669 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-var-lib-kubelet\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.772790 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772701 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-var-lib-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.773437 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.773437 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.772773 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-env-overrides\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.774886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.774481 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-thpgj\""
Apr 16 19:54:02.774886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.774561 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pzhfx\""
Apr 16 19:54:02.774886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.774637 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94xj9\""
Apr 16 19:54:02.775120 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775026 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:54:02.775120 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775097 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m46wp\""
Apr 16 19:54:02.775295 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775125 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:54:02.775603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775582 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:54:02.775702 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:54:02.775760 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775587 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x25sx\""
Apr 16 19:54:02.775908 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775871 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:54:02.776005 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.775988 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:54:02.787133 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.787113 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:54:02.787247 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.787148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8sc85\""
Apr 16 19:54:02.787247 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.787148 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:54:02.787356 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.787309 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:54:02.800488 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.800462 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:01 +0000 UTC" deadline="2028-01-26 22:56:42.194763172 +0000 UTC"
Apr 16 19:54:02.800488 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.800486 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15603h2m39.394280393s"
Apr 16 19:54:02.861319 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.861268 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:54:02.873316 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-run\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.873316 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-systemd-units\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-bin\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-device-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-systemd-units\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873384 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-systemd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-device-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-run\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-systemd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873472 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-config\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-multus-daemon-config\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-lib-modules\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-slash\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-ovn\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-multus-certs\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-tmp\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2sp\" (UniqueName: \"kubernetes.io/projected/bf55464c-e6ac-41d5-98de-59d9df6a82e0-kube-api-access-dd2sp\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-slash\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-ovn\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873726 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873727 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-lib-modules\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4z49\" (UniqueName: \"kubernetes.io/projected/ede74e32-9e13-4250-9116-a7ce9f6af0a6-kube-api-access-c4z49\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.873892 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-serviceca\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.873937 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-config\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-socket-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-registration-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-node-log\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-konnectivity-ca\") pod \"konnectivity-agent-nnsm9\" (UID: \"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-socket-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874193 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874204 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-registration-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-conf\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-systemd\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cnibin\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-cnibin\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc946656-c72d-400e-a2cc-76aa86a4b014-host-slash\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-konnectivity-ca\") pod \"konnectivity-agent-nnsm9\" (UID: \"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7hq\" (UniqueName: \"kubernetes.io/projected/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-kube-api-access-gm7hq\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-node-log\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.874588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-systemd\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cnibin\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-serviceca\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874353 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysctl-conf\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-host\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-host\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874667 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-kubelet\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874714 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-kubelet\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874753 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-k8s-cni-cncf-io\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-kubelet\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysconfig\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-sysconfig\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-system-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kg8d\" (UniqueName: \"kubernetes.io/projected/b7555a2d-0c33-4639-a546-dc00100629cf-kube-api-access-7kg8d\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7"
Apr 16 19:54:02.875355 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874944 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-system-cni-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-os-release\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-bin\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.874993 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-netd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovn-node-metrics-cert\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td9n5\" (UniqueName: \"kubernetes.io/projected/cc946656-c72d-400e-a2cc-76aa86a4b014-kube-api-access-td9n5\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwg68\" (UniqueName: \"kubernetes.io/projected/9bb855b9-5001-4021-a934-ecc26434d057-kube-api-access-mwg68\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-os-release\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-bin\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-cni-netd\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-system-cni-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-sys\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-sys\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz"
Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-etc-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875194 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-etc-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-log-socket\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-kubernetes\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-log-socket\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmc2l\" (UniqueName: 
\"kubernetes.io/projected/5b914463-981e-407b-9d5c-37f855389e30-kube-api-access-zmc2l\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875317 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-script-lib\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-kubernetes\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-os-release\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875407 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875432 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf55464c-e6ac-41d5-98de-59d9df6a82e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875485 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-multus\") pod \"multus-5zhw7\" (UID: 
\"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc946656-c72d-400e-a2cc-76aa86a4b014-iptables-alerter-script\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-etc-tuned\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-host\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.876951 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875639 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-agent-certs\") pod \"konnectivity-agent-nnsm9\" (UID: \"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-host\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-cni-binary-copy\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-socket-dir-parent\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-netns\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-etc-selinux\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-modprobe-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-var-lib-kubelet\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovnkube-script-lib\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875868 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-var-lib-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-env-overrides\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-etc-kubernetes\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.877845 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:54:02.875981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkwg\" (UniqueName: \"kubernetes.io/projected/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-kube-api-access-xdkwg\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-var-lib-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.877845 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-sys-fs\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876015 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-etc-modprobe-d\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.875923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-etc-selinux\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876050 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-netns\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-run-openvswitch\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876102 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede74e32-9e13-4250-9116-a7ce9f6af0a6-host-run-netns\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9bb855b9-5001-4021-a934-ecc26434d057-sys-fs\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b914463-981e-407b-9d5c-37f855389e30-var-lib-kubelet\") pod \"tuned-cqztz\" (UID: 
\"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-hostroot\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-conf-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876257 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf55464c-e6ac-41d5-98de-59d9df6a82e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876313 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ede74e32-9e13-4250-9116-a7ce9f6af0a6-env-overrides\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.876901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-tmp\") pod \"tuned-cqztz\" (UID: 
\"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.877973 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ede74e32-9e13-4250-9116-a7ce9f6af0a6-ovn-node-metrics-cert\") pod \"ovnkube-node-cspns\" (UID: \"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.878230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a-agent-certs\") pod \"konnectivity-agent-nnsm9\" (UID: \"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a\") " pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:02.878466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.878296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5b914463-981e-407b-9d5c-37f855389e30-etc-tuned\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.882983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.882900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm7hq\" (UniqueName: \"kubernetes.io/projected/b352b6f3-ece0-4811-9bbf-e58c2cfe8081-kube-api-access-gm7hq\") pod \"node-ca-mcvlm\" (UID: \"b352b6f3-ece0-4811-9bbf-e58c2cfe8081\") " pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:02.882983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.882958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4z49\" (UniqueName: \"kubernetes.io/projected/ede74e32-9e13-4250-9116-a7ce9f6af0a6-kube-api-access-c4z49\") pod \"ovnkube-node-cspns\" (UID: 
\"ede74e32-9e13-4250-9116-a7ce9f6af0a6\") " pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:02.883680 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.883643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmc2l\" (UniqueName: \"kubernetes.io/projected/5b914463-981e-407b-9d5c-37f855389e30-kube-api-access-zmc2l\") pod \"tuned-cqztz\" (UID: \"5b914463-981e-407b-9d5c-37f855389e30\") " pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:02.884083 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.884060 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwg68\" (UniqueName: \"kubernetes.io/projected/9bb855b9-5001-4021-a934-ecc26434d057-kube-api-access-mwg68\") pod \"aws-ebs-csi-driver-node-vw7dr\" (UID: \"9bb855b9-5001-4021-a934-ecc26434d057\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:02.884784 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.884762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2sp\" (UniqueName: \"kubernetes.io/projected/bf55464c-e6ac-41d5-98de-59d9df6a82e0-kube-api-access-dd2sp\") pod \"multus-additional-cni-plugins-p856m\" (UID: \"bf55464c-e6ac-41d5-98de-59d9df6a82e0\") " pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:02.976625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-cni-binary-copy\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.976625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-socket-dir-parent\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.976832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-socket-dir-parent\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.976832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-netns\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.976832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-etc-kubernetes\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.976832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkwg\" (UniqueName: \"kubernetes.io/projected/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-kube-api-access-xdkwg\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.976832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-hostroot\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977011 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-etc-kubernetes\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977011 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-hostroot\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977011 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.976949 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-netns\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977117 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-conf-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977117 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-bin\") pod \"multus-5zhw7\" (UID: 
\"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977117 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977067 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-multus-daemon-config\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977117 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-conf-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-multus-certs\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977160 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-multus-certs\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " 
pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-cni-binary-copy\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-cnibin\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-multus-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-bin\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc946656-c72d-400e-a2cc-76aa86a4b014-host-slash\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.977323 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:54:02.977262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-k8s-cni-cncf-io\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc946656-c72d-400e-a2cc-76aa86a4b014-host-slash\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977267 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-cnibin\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-kubelet\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-run-k8s-cni-cncf-io\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977307 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-system-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kg8d\" (UniqueName: \"kubernetes.io/projected/b7555a2d-0c33-4639-a546-dc00100629cf-kube-api-access-7kg8d\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977323 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977330 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-kubelet\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td9n5\" (UniqueName: \"kubernetes.io/projected/cc946656-c72d-400e-a2cc-76aa86a4b014-kube-api-access-td9n5\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-os-release\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977422 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-system-cni-dir\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977427 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-multus\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.977512 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/b7555a2d-0c33-4639-a546-dc00100629cf-multus-daemon-config\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc946656-c72d-400e-a2cc-76aa86a4b014-iptables-alerter-script\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-host-var-lib-cni-multus\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7555a2d-0c33-4639-a546-dc00100629cf-os-release\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:02.977941 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.977604 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:03.477584048 +0000 UTC m=+3.153586121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:02.978380 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.977965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cc946656-c72d-400e-a2cc-76aa86a4b014-iptables-alerter-script\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:02.978938 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.978921 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:02.996738 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.996667 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:02.996738 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.996689 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:02.996738 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.996704 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.996975 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:02.996781 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:03.496760135 +0000 UTC m=+3.172762206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:02.997939 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.997917 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkwg\" (UniqueName: \"kubernetes.io/projected/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-kube-api-access-xdkwg\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:02.998730 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:02.998701 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td9n5\" (UniqueName: \"kubernetes.io/projected/cc946656-c72d-400e-a2cc-76aa86a4b014-kube-api-access-td9n5\") pod \"iptables-alerter-4fs8r\" (UID: \"cc946656-c72d-400e-a2cc-76aa86a4b014\") " pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:03.000184 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.000158 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kg8d\" (UniqueName: \"kubernetes.io/projected/b7555a2d-0c33-4639-a546-dc00100629cf-kube-api-access-7kg8d\") pod \"multus-5zhw7\" (UID: \"b7555a2d-0c33-4639-a546-dc00100629cf\") " pod="openshift-multus/multus-5zhw7" Apr 16 19:54:03.072563 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.072534 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:03.078269 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.078243 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mcvlm" Apr 16 19:54:03.086824 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.086806 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:03.092343 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.092321 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" Apr 16 19:54:03.099843 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.099826 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqztz" Apr 16 19:54:03.108390 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.108371 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p856m" Apr 16 19:54:03.113958 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.113912 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5zhw7" Apr 16 19:54:03.120443 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.120426 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fs8r" Apr 16 19:54:03.430660 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.430593 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb855b9_5001_4021_a934_ecc26434d057.slice/crio-d14758ef6248f9edf78949ecf91f50d30bfad16b21a1b336b3a693a72df7d2c5 WatchSource:0}: Error finding container d14758ef6248f9edf78949ecf91f50d30bfad16b21a1b336b3a693a72df7d2c5: Status 404 returned error can't find the container with id d14758ef6248f9edf78949ecf91f50d30bfad16b21a1b336b3a693a72df7d2c5 Apr 16 19:54:03.432512 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.432488 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7555a2d_0c33_4639_a546_dc00100629cf.slice/crio-7d62cad65f2bdb7f2e8447768b51ef4e2e4b4610c4714f98651a81c62fe7a315 WatchSource:0}: Error finding container 7d62cad65f2bdb7f2e8447768b51ef4e2e4b4610c4714f98651a81c62fe7a315: Status 404 returned error can't find the container with id 7d62cad65f2bdb7f2e8447768b51ef4e2e4b4610c4714f98651a81c62fe7a315 Apr 16 19:54:03.433889 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.433733 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa5d1b5_3f07_41ed_81f1_cd7e2a96551a.slice/crio-5c1001ef8a8b17748d04ba64c0455daa60974019101ef8b1a590cd5ae43262a5 WatchSource:0}: Error finding container 5c1001ef8a8b17748d04ba64c0455daa60974019101ef8b1a590cd5ae43262a5: Status 404 returned error can't find the container with id 5c1001ef8a8b17748d04ba64c0455daa60974019101ef8b1a590cd5ae43262a5 Apr 16 19:54:03.437734 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.437607 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb352b6f3_ece0_4811_9bbf_e58c2cfe8081.slice/crio-a2d1a9eadf2748e218aec76a1a85632874563220ea976f6f1e6e1bbcbdcb79ae WatchSource:0}: Error finding container a2d1a9eadf2748e218aec76a1a85632874563220ea976f6f1e6e1bbcbdcb79ae: Status 404 returned error can't find the container with id a2d1a9eadf2748e218aec76a1a85632874563220ea976f6f1e6e1bbcbdcb79ae Apr 16 19:54:03.438464 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.438438 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc946656_c72d_400e_a2cc_76aa86a4b014.slice/crio-8e7d95782c1df53949782aed9e288be3acaa65f5c4eb3676be3a59a6790f8a80 WatchSource:0}: Error finding container 8e7d95782c1df53949782aed9e288be3acaa65f5c4eb3676be3a59a6790f8a80: Status 404 returned error can't find the container with id 8e7d95782c1df53949782aed9e288be3acaa65f5c4eb3676be3a59a6790f8a80 Apr 16 19:54:03.439479 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.439455 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf55464c_e6ac_41d5_98de_59d9df6a82e0.slice/crio-4e3c78305687d1d5343e0ee281126ec37f6e5822b165ad1c34f7876e4290a9cf WatchSource:0}: Error finding container 4e3c78305687d1d5343e0ee281126ec37f6e5822b165ad1c34f7876e4290a9cf: Status 404 returned error can't find the container with id 4e3c78305687d1d5343e0ee281126ec37f6e5822b165ad1c34f7876e4290a9cf Apr 16 19:54:03.440873 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.440840 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b914463_981e_407b_9d5c_37f855389e30.slice/crio-881f67599b3bf88adf8579e69669d5f8b60af164a0cb0b23085ec29d2330a83f WatchSource:0}: Error finding container 881f67599b3bf88adf8579e69669d5f8b60af164a0cb0b23085ec29d2330a83f: Status 404 returned error can't find 
the container with id 881f67599b3bf88adf8579e69669d5f8b60af164a0cb0b23085ec29d2330a83f Apr 16 19:54:03.442022 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:03.442002 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede74e32_9e13_4250_9116_a7ce9f6af0a6.slice/crio-e63ad9dc0f86d0afdff0eac423326b2e0d68f4f12d560f14163d1e47ca03411c WatchSource:0}: Error finding container e63ad9dc0f86d0afdff0eac423326b2e0d68f4f12d560f14163d1e47ca03411c: Status 404 returned error can't find the container with id e63ad9dc0f86d0afdff0eac423326b2e0d68f4f12d560f14163d1e47ca03411c Apr 16 19:54:03.480644 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.480624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:03.480738 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.480726 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:03.480781 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.480773 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.480759468 +0000 UTC m=+4.156761522 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:03.581814 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.581758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:03.581966 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.581932 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:03.581966 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.581954 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:03.582073 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.581970 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:03.582073 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:03.582033 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:04.582013667 +0000 UTC m=+4.258015741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:03.801399 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.801336 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:01 +0000 UTC" deadline="2028-01-02 03:37:57.849463314 +0000 UTC" Apr 16 19:54:03.801399 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.801372 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15007h43m54.048094249s" Apr 16 19:54:03.904501 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.904425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal" event={"ID":"ab1d9840308a9e93932284ca7f6a67ee","Type":"ContainerStarted","Data":"48330becdb74b0a54de79a03e14a8a650a3f0561bc9443e092200575197fe98f"} Apr 16 19:54:03.908180 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.908121 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"e63ad9dc0f86d0afdff0eac423326b2e0d68f4f12d560f14163d1e47ca03411c"} Apr 16 19:54:03.909450 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.909338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" 
event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerStarted","Data":"4e3c78305687d1d5343e0ee281126ec37f6e5822b165ad1c34f7876e4290a9cf"} Apr 16 19:54:03.919737 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.919625 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mcvlm" event={"ID":"b352b6f3-ece0-4811-9bbf-e58c2cfe8081","Type":"ContainerStarted","Data":"a2d1a9eadf2748e218aec76a1a85632874563220ea976f6f1e6e1bbcbdcb79ae"} Apr 16 19:54:03.932023 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.931636 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zhw7" event={"ID":"b7555a2d-0c33-4639-a546-dc00100629cf","Type":"ContainerStarted","Data":"7d62cad65f2bdb7f2e8447768b51ef4e2e4b4610c4714f98651a81c62fe7a315"} Apr 16 19:54:03.937133 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.936946 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqztz" event={"ID":"5b914463-981e-407b-9d5c-37f855389e30","Type":"ContainerStarted","Data":"881f67599b3bf88adf8579e69669d5f8b60af164a0cb0b23085ec29d2330a83f"} Apr 16 19:54:03.939561 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.939305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fs8r" event={"ID":"cc946656-c72d-400e-a2cc-76aa86a4b014","Type":"ContainerStarted","Data":"8e7d95782c1df53949782aed9e288be3acaa65f5c4eb3676be3a59a6790f8a80"} Apr 16 19:54:03.942765 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.942738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nnsm9" event={"ID":"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a","Type":"ContainerStarted","Data":"5c1001ef8a8b17748d04ba64c0455daa60974019101ef8b1a590cd5ae43262a5"} Apr 16 19:54:03.946686 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:03.946335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" event={"ID":"9bb855b9-5001-4021-a934-ecc26434d057","Type":"ContainerStarted","Data":"d14758ef6248f9edf78949ecf91f50d30bfad16b21a1b336b3a693a72df7d2c5"} Apr 16 19:54:04.491806 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.491716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:04.491964 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.491897 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:04.491964 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.491962 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:06.491942793 +0000 UTC m=+6.167944847 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:04.592723 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.592683 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:04.592936 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.592917 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:04.593017 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.592943 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:04.593017 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.592957 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:04.593017 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.593015 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:06.592997637 +0000 UTC m=+6.268999697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:04.902674 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.902641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:04.903137 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.902761 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:04.903271 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.903165 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:04.903426 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:04.903363 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:04.976925 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.976426 2568 generic.go:358] "Generic (PLEG): container finished" podID="9c0ec544a6cdf582befbb1945cc59559" containerID="de43db31412d7bc8fe5ce745bc88fd912e654478dde9ca0e497806e699970b9e" exitCode=0 Apr 16 19:54:04.976925 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.976636 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal" event={"ID":"9c0ec544a6cdf582befbb1945cc59559","Type":"ContainerDied","Data":"de43db31412d7bc8fe5ce745bc88fd912e654478dde9ca0e497806e699970b9e"} Apr 16 19:54:04.992529 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:04.992474 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-191.ec2.internal" podStartSLOduration=2.9924580560000003 podStartE2EDuration="2.992458056s" podCreationTimestamp="2026-04-16 19:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:03.918236194 +0000 UTC m=+3.594238271" watchObservedRunningTime="2026-04-16 19:54:04.992458056 +0000 UTC m=+4.668460133" Apr 16 19:54:05.984940 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:05.984904 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal" event={"ID":"9c0ec544a6cdf582befbb1945cc59559","Type":"ContainerStarted","Data":"6b5087bb104607dbe248d3d4647d50bc0eaf4bdaf12c4a963abeecc9a769db99"} Apr 16 19:54:06.002681 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:06.002297 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-191.ec2.internal" 
podStartSLOduration=4.002278261 podStartE2EDuration="4.002278261s" podCreationTimestamp="2026-04-16 19:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:06.001530456 +0000 UTC m=+5.677532534" watchObservedRunningTime="2026-04-16 19:54:06.002278261 +0000 UTC m=+5.678280338" Apr 16 19:54:06.508513 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:06.507875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:06.508513 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.508074 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:06.508513 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.508144 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:10.508122284 +0000 UTC m=+10.184124359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:06.609272 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:06.609235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:06.609443 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.609415 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:06.609443 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.609434 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:06.609569 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.609447 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.609569 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.609503 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:10.609484455 +0000 UTC m=+10.285486514 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:06.893524 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:06.892657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:06.893524 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.892802 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:06.893524 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:06.892898 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:06.893524 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:06.892985 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:08.893290 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:08.893253 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:08.893739 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:08.893393 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:08.893739 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:08.893449 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:08.893739 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:08.893568 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:10.541010 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:10.540365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:10.541010 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.540524 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:10.541010 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.540589 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:18.54056822 +0000 UTC m=+18.216570280 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:10.641133 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:10.641062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:10.641315 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.641258 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:10.641315 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.641278 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:10.641315 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.641290 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:10.641527 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.641349 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:18.641329377 +0000 UTC m=+18.317331431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:10.893884 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:10.893325 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:10.893884 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.893451 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:10.894957 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:10.894810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:10.894957 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:10.894913 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:12.893008 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:12.892967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:12.893454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:12.892967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:12.893454 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:12.893115 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:12.893454 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:12.893204 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:14.702886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.702850 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t5cbx"] Apr 16 19:54:14.705864 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.705841 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.708744 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.708722 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wgp9m\"" Apr 16 19:54:14.708848 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.708725 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:14.710034 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.710014 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:14.769390 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.769355 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbsd\" (UniqueName: \"kubernetes.io/projected/51444d65-22cd-418d-af7c-4510ee2ee6d2-kube-api-access-ffbsd\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.769390 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.769400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51444d65-22cd-418d-af7c-4510ee2ee6d2-hosts-file\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.769620 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.769439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51444d65-22cd-418d-af7c-4510ee2ee6d2-tmp-dir\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.869847 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:54:14.869812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbsd\" (UniqueName: \"kubernetes.io/projected/51444d65-22cd-418d-af7c-4510ee2ee6d2-kube-api-access-ffbsd\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.870008 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.869861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51444d65-22cd-418d-af7c-4510ee2ee6d2-hosts-file\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.870008 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.869891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51444d65-22cd-418d-af7c-4510ee2ee6d2-tmp-dir\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.870139 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.870008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51444d65-22cd-418d-af7c-4510ee2ee6d2-hosts-file\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.870347 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.870327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51444d65-22cd-418d-af7c-4510ee2ee6d2-tmp-dir\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.879857 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.879820 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ffbsd\" (UniqueName: \"kubernetes.io/projected/51444d65-22cd-418d-af7c-4510ee2ee6d2-kube-api-access-ffbsd\") pod \"node-resolver-t5cbx\" (UID: \"51444d65-22cd-418d-af7c-4510ee2ee6d2\") " pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:14.892680 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.892658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:14.892813 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:14.892689 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:14.892813 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:14.892784 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:14.892954 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:14.892925 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:15.015542 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:15.015488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t5cbx" Apr 16 19:54:16.892689 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:16.892658 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:16.892689 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:16.892673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:16.893246 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:16.892770 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:16.893246 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:16.892907 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:18.594017 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:18.593985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:18.594580 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.594154 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:18.594580 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.594252 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:34.594230471 +0000 UTC m=+34.270232545 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:18.694882 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:18.694847 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:18.695058 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.695006 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:18.695058 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.695021 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:18.695058 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.695034 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:18.695206 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.695095 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:34.695079926 +0000 UTC m=+34.371081983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:18.892953 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:18.892866 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:18.893123 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:18.892866 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:18.893123 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.893021 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:18.893123 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:18.893091 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:20.074074 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:20.073897 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51444d65_22cd_418d_af7c_4510ee2ee6d2.slice/crio-3467b5cda0fd94622940d405fbd4125d6c8b5b3323fab77ed678917a55e0a8a3 WatchSource:0}: Error finding container 3467b5cda0fd94622940d405fbd4125d6c8b5b3323fab77ed678917a55e0a8a3: Status 404 returned error can't find the container with id 3467b5cda0fd94622940d405fbd4125d6c8b5b3323fab77ed678917a55e0a8a3 Apr 16 19:54:20.894686 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:20.894420 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:20.894838 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:20.894494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:20.894838 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:20.894733 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:20.894942 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:20.894838 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:21.011939 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.011897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zhw7" event={"ID":"b7555a2d-0c33-4639-a546-dc00100629cf","Type":"ContainerStarted","Data":"d6523c8cd0755f0890ed294acdd6546d9b9e98727e35d485fde4c9bcfe4ca65a"} Apr 16 19:54:21.013520 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.013491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t5cbx" event={"ID":"51444d65-22cd-418d-af7c-4510ee2ee6d2","Type":"ContainerStarted","Data":"252fbd225532380f414fc62e86d590d83fa34792683bd85fce330d021b92b691"} Apr 16 19:54:21.013676 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.013525 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t5cbx" event={"ID":"51444d65-22cd-418d-af7c-4510ee2ee6d2","Type":"ContainerStarted","Data":"3467b5cda0fd94622940d405fbd4125d6c8b5b3323fab77ed678917a55e0a8a3"} Apr 16 19:54:21.014894 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.014863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqztz" event={"ID":"5b914463-981e-407b-9d5c-37f855389e30","Type":"ContainerStarted","Data":"1bb14beeaefb3beccc21000463ab52b89aa9ed3972295e9c812046e7eeb00b9b"} Apr 16 19:54:21.016441 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.016404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nnsm9" event={"ID":"9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a","Type":"ContainerStarted","Data":"35fcde05f5f2b1af7e4e117796f191a7a7688c86c3cb2d366cfde0bfa883ef17"} Apr 16 19:54:21.017907 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.017883 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" 
event={"ID":"9bb855b9-5001-4021-a934-ecc26434d057","Type":"ContainerStarted","Data":"106c1c127cbfda7a01dd7abf2aca7c3275227e46c09d87955739cb804b914a7e"} Apr 16 19:54:21.020774 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020750 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"e8090392e302f1655416fdf011163e48431a472123ea4e2156c72c9094bd1347"} Apr 16 19:54:21.020879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020779 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"0b7a5140c8b924d23776be0dcf915fb9ddbc2198c70e186b85e24572186dc615"} Apr 16 19:54:21.020879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"1b59cab10942296fe481d491632e118c0c41df936ed708a7c62144ef4f5702af"} Apr 16 19:54:21.020879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020801 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"88a5e3541850085edfdffeaa583a790d2542041c1383bccb2c76b6ea32564221"} Apr 16 19:54:21.020879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020814 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"b833d730247a98de03842cbb1bfa8742ce20691db6ec2a96021b91f532b66aa2"} Apr 16 19:54:21.020879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.020828 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"264aa79aa7fa191891315c431c0316e0e0b4ce21dba56c87a20b0cb007f926e5"} Apr 16 19:54:21.022204 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.022162 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="78c8afc0e6e88b464d601132c2746895292e8351becc33aabfe545c51e2326c8" exitCode=0 Apr 16 19:54:21.022325 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.022203 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"78c8afc0e6e88b464d601132c2746895292e8351becc33aabfe545c51e2326c8"} Apr 16 19:54:21.023932 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.023882 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mcvlm" event={"ID":"b352b6f3-ece0-4811-9bbf-e58c2cfe8081","Type":"ContainerStarted","Data":"6179568f3307548474c550ea6f5043cc72f8aec3bda20b012199ff175f0d644c"} Apr 16 19:54:21.036327 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.036290 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5zhw7" podStartSLOduration=3.389847793 podStartE2EDuration="20.036276745s" podCreationTimestamp="2026-04-16 19:54:01 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.43702793 +0000 UTC m=+3.113029987" lastFinishedPulling="2026-04-16 19:54:20.083456875 +0000 UTC m=+19.759458939" observedRunningTime="2026-04-16 19:54:21.036116941 +0000 UTC m=+20.712119018" watchObservedRunningTime="2026-04-16 19:54:21.036276745 +0000 UTC m=+20.712278823" Apr 16 19:54:21.053613 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.053577 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t5cbx" 
podStartSLOduration=7.053561992 podStartE2EDuration="7.053561992s" podCreationTimestamp="2026-04-16 19:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:21.053215745 +0000 UTC m=+20.729217821" watchObservedRunningTime="2026-04-16 19:54:21.053561992 +0000 UTC m=+20.729564069" Apr 16 19:54:21.072032 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.071998 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mcvlm" podStartSLOduration=4.487356655 podStartE2EDuration="21.071987025s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.439552706 +0000 UTC m=+3.115554759" lastFinishedPulling="2026-04-16 19:54:20.024183023 +0000 UTC m=+19.700185129" observedRunningTime="2026-04-16 19:54:21.071776004 +0000 UTC m=+20.747778081" watchObservedRunningTime="2026-04-16 19:54:21.071987025 +0000 UTC m=+20.747989106" Apr 16 19:54:21.113646 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.113605 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cqztz" podStartSLOduration=4.492777582 podStartE2EDuration="21.113594894s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.443467953 +0000 UTC m=+3.119470011" lastFinishedPulling="2026-04-16 19:54:20.064285261 +0000 UTC m=+19.740287323" observedRunningTime="2026-04-16 19:54:21.113267575 +0000 UTC m=+20.789269650" watchObservedRunningTime="2026-04-16 19:54:21.113594894 +0000 UTC m=+20.789596972" Apr 16 19:54:21.162298 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.162091 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nnsm9" podStartSLOduration=9.116578263 podStartE2EDuration="21.162074822s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" 
firstStartedPulling="2026-04-16 19:54:03.437350007 +0000 UTC m=+3.113352064" lastFinishedPulling="2026-04-16 19:54:15.482846556 +0000 UTC m=+15.158848623" observedRunningTime="2026-04-16 19:54:21.161885831 +0000 UTC m=+20.837887911" watchObservedRunningTime="2026-04-16 19:54:21.162074822 +0000 UTC m=+20.838076899" Apr 16 19:54:21.255301 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.255271 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:21.828755 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.828646 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:21.255295912Z","UUID":"86c55cb1-2925-4782-b1fc-e718e3899099","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:21.832625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.832598 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:21.832625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:21.832630 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:22.027094 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:22.027051 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fs8r" event={"ID":"cc946656-c72d-400e-a2cc-76aa86a4b014","Type":"ContainerStarted","Data":"1dfeaffb9f41aa94dc4dbddabddfb63ff6d93cad0bf1c5fa5917a10cfcb4ef6a"} Apr 16 19:54:22.029159 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:22.029117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" 
event={"ID":"9bb855b9-5001-4021-a934-ecc26434d057","Type":"ContainerStarted","Data":"a47f087203f75deb0484a23fca1f3f6130c43c71c4c5546c585822a72a4d0a67"} Apr 16 19:54:22.050067 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:22.050007 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4fs8r" podStartSLOduration=4.66571999 podStartE2EDuration="21.049990649s" podCreationTimestamp="2026-04-16 19:54:01 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.44010545 +0000 UTC m=+3.116107509" lastFinishedPulling="2026-04-16 19:54:19.82437611 +0000 UTC m=+19.500378168" observedRunningTime="2026-04-16 19:54:22.049702702 +0000 UTC m=+21.725704779" watchObservedRunningTime="2026-04-16 19:54:22.049990649 +0000 UTC m=+21.725992726" Apr 16 19:54:22.893359 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:22.893332 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:22.893959 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:22.893334 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:22.893959 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:22.893448 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:22.893959 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:22.893571 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:23.033931 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:23.033879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"37ef5c772ad320e9afe807e3f6ab57d732ec6091eb0d6170a1ecbbacc336a548"} Apr 16 19:54:23.036073 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:23.036041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" event={"ID":"9bb855b9-5001-4021-a934-ecc26434d057","Type":"ContainerStarted","Data":"14176982dc212cbf02959756f7b9cf73e4eac60907700867aa72705d3d80b025"} Apr 16 19:54:23.056676 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:23.056636 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vw7dr" podStartSLOduration=4.250015161 podStartE2EDuration="23.05662327s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.432547793 +0000 UTC m=+3.108549850" lastFinishedPulling="2026-04-16 19:54:22.239155882 +0000 UTC m=+21.915157959" observedRunningTime="2026-04-16 19:54:23.055496196 +0000 UTC m=+22.731498272" watchObservedRunningTime="2026-04-16 19:54:23.05662327 +0000 UTC m=+22.732625348" Apr 16 19:54:23.971951 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:54:23.971900 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:24.154348 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:24.154305 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:24.155092 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:24.155073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:24.892787 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:24.892664 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:24.892918 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:24.892717 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:24.892980 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:24.892913 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:24.893037 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:24.892970 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:25.042246 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:25.042210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" event={"ID":"ede74e32-9e13-4250-9116-a7ce9f6af0a6","Type":"ContainerStarted","Data":"52570e505f658eaae4cb6e5787df688d9d6e726581896a17f4bc0f6e34966c67"} Apr 16 19:54:25.043362 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:25.043342 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nnsm9" Apr 16 19:54:25.072402 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:25.072364 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" podStartSLOduration=8.404155889 podStartE2EDuration="25.072349332s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.444523825 +0000 UTC m=+3.120525878" lastFinishedPulling="2026-04-16 19:54:20.112717266 +0000 UTC m=+19.788719321" observedRunningTime="2026-04-16 19:54:25.070745446 +0000 UTC m=+24.746747522" watchObservedRunningTime="2026-04-16 19:54:25.072349332 +0000 UTC m=+24.748351408" Apr 16 19:54:26.045040 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.045008 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="0cac65ff7f3516384ea8700bf5e1123162080c79c4ecb212816d939d6b86bce6" exitCode=0 Apr 16 19:54:26.045507 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.045063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"0cac65ff7f3516384ea8700bf5e1123162080c79c4ecb212816d939d6b86bce6"} Apr 16 19:54:26.046318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.045835 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:26.046318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.045861 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:26.046318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.045875 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:26.062225 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.062204 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:26.062619 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.062602 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns" Apr 16 19:54:26.892812 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.892782 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:26.892925 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:26.892782 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:26.892999 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:26.892931 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:26.892999 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:26.892978 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:27.084183 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:27.084141 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dpc5h"] Apr 16 19:54:27.084466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:27.084296 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:27.084466 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:27.084385 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:27.087872 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:27.087848 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v62bb"] Apr 16 19:54:27.087979 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:27.087964 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:27.088109 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:27.088075 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:28.050382 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:28.050117 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="ffe811cf03e851e5f9c1e98ca49f756a54fa8c1853bb5460d49915e44e72b6ea" exitCode=0 Apr 16 19:54:28.050517 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:28.050204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"ffe811cf03e851e5f9c1e98ca49f756a54fa8c1853bb5460d49915e44e72b6ea"} Apr 16 19:54:28.893390 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:28.893360 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:28.893983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:28.893365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:28.893983 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:28.893466 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:28.893983 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:28.893551 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:28.914613 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:28.914586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t5cbx_51444d65-22cd-418d-af7c-4510ee2ee6d2/dns-node-resolver/0.log" Apr 16 19:54:29.695928 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:29.695902 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mcvlm_b352b6f3-ece0-4811-9bbf-e58c2cfe8081/node-ca/0.log" Apr 16 19:54:30.057002 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:30.056967 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="4d0142f3755fc51f1c629275f2517708e7d0a4f8c4b1858d987e9fb086681f06" exitCode=0 Apr 16 19:54:30.057533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:30.057016 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"4d0142f3755fc51f1c629275f2517708e7d0a4f8c4b1858d987e9fb086681f06"} Apr 16 19:54:30.893702 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:30.893667 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:30.893874 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:30.893787 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:30.893874 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:30.893820 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:30.894002 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:30.893893 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:32.892703 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:32.892673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:32.893304 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:32.892807 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:32.893304 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:32.892861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:32.893304 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:32.892959 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:34.612833 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:34.612799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:34.613266 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.612905 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:34.613266 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.612967 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs podName:12ed67c2-088e-47ad-b2f4-d5da475ea9fc nodeName:}" failed. No retries permitted until 2026-04-16 19:55:06.612950415 +0000 UTC m=+66.288952470 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs") pod "network-metrics-daemon-v62bb" (UID: "12ed67c2-088e-47ad-b2f4-d5da475ea9fc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:34.713964 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:34.713931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:34.714123 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.714083 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:34.714123 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.714104 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:34.714123 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.714118 2568 projected.go:194] Error preparing data for projected volume kube-api-access-hq9qv for pod openshift-network-diagnostics/network-check-target-dpc5h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:34.714266 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.714189 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv podName:fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:55:06.714157746 +0000 UTC m=+66.390159819 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hq9qv" (UniqueName: "kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv") pod "network-check-target-dpc5h" (UID: "fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:34.893441 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:34.893353 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:34.893600 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.893485 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:34.893600 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:34.893539 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:34.893709 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:34.893646 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:36.071100 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:36.070928 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerStarted","Data":"098bbdbc35438005d26eec6208888a403dc315a2b4b854b33a511ded13b6696d"} Apr 16 19:54:36.893408 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:36.893375 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:36.893583 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:36.893465 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4" Apr 16 19:54:36.893583 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:36.893379 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:36.893583 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:36.893537 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc" Apr 16 19:54:37.075454 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:37.075420 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="098bbdbc35438005d26eec6208888a403dc315a2b4b854b33a511ded13b6696d" exitCode=0 Apr 16 19:54:37.075810 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:37.075468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"098bbdbc35438005d26eec6208888a403dc315a2b4b854b33a511ded13b6696d"} Apr 16 19:54:38.080187 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:38.080146 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf55464c-e6ac-41d5-98de-59d9df6a82e0" containerID="e38991232c9110c9b4de850a9d61d030f0c9105f40c8ccfe00a0a5dc97bdeb78" exitCode=0 Apr 16 19:54:38.080574 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:38.080215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerDied","Data":"e38991232c9110c9b4de850a9d61d030f0c9105f40c8ccfe00a0a5dc97bdeb78"} Apr 16 19:54:38.893308 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:38.893272 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:38.893475 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:38.893445 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:38.893533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:38.893483 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:38.893612 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:38.893596 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:39.087349 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:39.087316 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p856m" event={"ID":"bf55464c-e6ac-41d5-98de-59d9df6a82e0","Type":"ContainerStarted","Data":"8eeaba049849433095671c6bede333ef81b6b0f155773397103f56da33517d23"}
Apr 16 19:54:39.114830 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:39.114787 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p856m" podStartSLOduration=6.669177092 podStartE2EDuration="39.114776105s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="2026-04-16 19:54:03.441625466 +0000 UTC m=+3.117627520" lastFinishedPulling="2026-04-16 19:54:35.88722448 +0000 UTC m=+35.563226533" observedRunningTime="2026-04-16 19:54:39.113428852 +0000 UTC m=+38.789430927" watchObservedRunningTime="2026-04-16 19:54:39.114776105 +0000 UTC m=+38.790778181"
Apr 16 19:54:40.893772 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:40.893740 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:40.894213 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:40.893844 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:40.894213 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:40.893926 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:40.894213 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:40.894038 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:42.892624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:42.892593 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:42.893041 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:42.892603 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:42.893041 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:42.892702 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:42.893041 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:42.892767 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:44.892839 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:44.892815 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:44.893313 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:44.892816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:44.893313 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:44.892912 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:44.893313 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:44.892982 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:46.892671 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:46.892638 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:46.892671 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:46.892650 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:46.893052 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:46.892739 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:46.893052 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:46.892878 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:48.892492 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:48.892462 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:48.892854 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:48.892462 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:48.892854 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:48.892599 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:48.892854 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:48.892626 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:50.893222 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:50.893135 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:54:50.893587 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:50.893244 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dpc5h" podUID="fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4"
Apr 16 19:54:50.893587 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:50.893323 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb"
Apr 16 19:54:50.893587 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:50.893416 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v62bb" podUID="12ed67c2-088e-47ad-b2f4-d5da475ea9fc"
Apr 16 19:54:51.163846 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.163779 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-191.ec2.internal" event="NodeReady"
Apr 16 19:54:51.163992 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.163873 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 19:54:51.271006 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.270972 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrkv"]
Apr 16 19:54:51.300587 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.300500 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5xgdl"]
Apr 16 19:54:51.300694 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.300608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.306125 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.306098 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:54:51.306254 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.306147 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 19:54:51.306254 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.306188 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 19:54:51.306368 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.306187 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-676kc\""
Apr 16 19:54:51.306524 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.306506 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:54:51.318874 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.318852 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrkv"]
Apr 16 19:54:51.318959 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.318939 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.321810 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.321793 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 19:54:51.321928 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.321913 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 19:54:51.322234 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.322220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mmrlb\""
Apr 16 19:54:51.326789 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.326773 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 19:54:51.336848 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.336828 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5xgdl"]
Apr 16 19:54:51.341589 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.341568 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jhdrk"]
Apr 16 19:54:51.365520 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.365500 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.367801 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.367782 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jhdrk"]
Apr 16 19:54:51.368216 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.368197 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dknqq\""
Apr 16 19:54:51.368314 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.368253 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 19:54:51.368314 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.368280 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 19:54:51.429570 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1050de2f-179f-4060-953f-7c1c76584100-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.429570 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429549 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1050de2f-179f-4060-953f-7c1c76584100-crio-socket\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.429707 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1050de2f-179f-4060-953f-7c1c76584100-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.429707 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429639 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm67z\" (UniqueName: \"kubernetes.io/projected/fded5762-e4ff-4f63-94bd-04c5209ebead-kube-api-access-wm67z\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.429805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1050de2f-179f-4060-953f-7c1c76584100-data-volume\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.429805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429734 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fded5762-e4ff-4f63-94bd-04c5209ebead-cert\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.429805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.429788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlh5z\" (UniqueName: \"kubernetes.io/projected/1050de2f-179f-4060-953f-7c1c76584100-kube-api-access-jlh5z\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.530624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcndb\" (UniqueName: \"kubernetes.io/projected/d94f7b13-7594-474a-a4d5-fc1f6d448d66-kube-api-access-vcndb\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.530725 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530648 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1050de2f-179f-4060-953f-7c1c76584100-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.530725 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm67z\" (UniqueName: \"kubernetes.io/projected/fded5762-e4ff-4f63-94bd-04c5209ebead-kube-api-access-wm67z\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.530836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530718 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1050de2f-179f-4060-953f-7c1c76584100-data-volume\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.530836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530750 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fded5762-e4ff-4f63-94bd-04c5209ebead-cert\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.530836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlh5z\" (UniqueName: \"kubernetes.io/projected/1050de2f-179f-4060-953f-7c1c76584100-kube-api-access-jlh5z\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.530836 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d94f7b13-7594-474a-a4d5-fc1f6d448d66-metrics-tls\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.531021 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d94f7b13-7594-474a-a4d5-fc1f6d448d66-config-volume\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.531021 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.530962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1050de2f-179f-4060-953f-7c1c76584100-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.531021 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.531003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1050de2f-179f-4060-953f-7c1c76584100-crio-socket\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.531140 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.531032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d94f7b13-7594-474a-a4d5-fc1f6d448d66-tmp-dir\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.531140 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.531048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1050de2f-179f-4060-953f-7c1c76584100-data-volume\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.531140 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.531086 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1050de2f-179f-4060-953f-7c1c76584100-crio-socket\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.531429 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.531406 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1050de2f-179f-4060-953f-7c1c76584100-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.534368 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.534345 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1050de2f-179f-4060-953f-7c1c76584100-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.534444 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.534408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fded5762-e4ff-4f63-94bd-04c5209ebead-cert\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.552923 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.552904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm67z\" (UniqueName: \"kubernetes.io/projected/fded5762-e4ff-4f63-94bd-04c5209ebead-kube-api-access-wm67z\") pod \"ingress-canary-5xgdl\" (UID: \"fded5762-e4ff-4f63-94bd-04c5209ebead\") " pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.555050 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.555030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlh5z\" (UniqueName: \"kubernetes.io/projected/1050de2f-179f-4060-953f-7c1c76584100-kube-api-access-jlh5z\") pod \"insights-runtime-extractor-qsrkv\" (UID: \"1050de2f-179f-4060-953f-7c1c76584100\") " pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.608943 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.608921 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsrkv"
Apr 16 19:54:51.626596 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.626575 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5xgdl"
Apr 16 19:54:51.631367 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.631344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d94f7b13-7594-474a-a4d5-fc1f6d448d66-metrics-tls\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.631424 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.631386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d94f7b13-7594-474a-a4d5-fc1f6d448d66-config-volume\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.631459 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.631435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d94f7b13-7594-474a-a4d5-fc1f6d448d66-tmp-dir\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.631502 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.631466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcndb\" (UniqueName: \"kubernetes.io/projected/d94f7b13-7594-474a-a4d5-fc1f6d448d66-kube-api-access-vcndb\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.633414 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.633399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d94f7b13-7594-474a-a4d5-fc1f6d448d66-metrics-tls\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.646808 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.646777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d94f7b13-7594-474a-a4d5-fc1f6d448d66-tmp-dir\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.646945 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.646927 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d94f7b13-7594-474a-a4d5-fc1f6d448d66-config-volume\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.648844 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.648821 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcndb\" (UniqueName: \"kubernetes.io/projected/d94f7b13-7594-474a-a4d5-fc1f6d448d66-kube-api-access-vcndb\") pod \"dns-default-jhdrk\" (UID: \"d94f7b13-7594-474a-a4d5-fc1f6d448d66\") " pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.674148 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.674124 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:51.815472 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.815445 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5xgdl"]
Apr 16 19:54:51.817437 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:51.817403 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfded5762_e4ff_4f63_94bd_04c5209ebead.slice/crio-91188892a7e463d18640a91e4cb41f5c7872419d6443a2dc554398347a422952 WatchSource:0}: Error finding container 91188892a7e463d18640a91e4cb41f5c7872419d6443a2dc554398347a422952: Status 404 returned error can't find the container with id 91188892a7e463d18640a91e4cb41f5c7872419d6443a2dc554398347a422952
Apr 16 19:54:51.817862 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.817840 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrkv"]
Apr 16 19:54:51.821342 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:51.821318 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1050de2f_179f_4060_953f_7c1c76584100.slice/crio-02cc75fa2dd1a7545320b884662d4b3883ebe7bb12afd07df097c1592c9c7d71 WatchSource:0}: Error finding container 02cc75fa2dd1a7545320b884662d4b3883ebe7bb12afd07df097c1592c9c7d71: Status 404 returned error can't find the container with id 02cc75fa2dd1a7545320b884662d4b3883ebe7bb12afd07df097c1592c9c7d71
Apr 16 19:54:51.823635 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.823614 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"]
Apr 16 19:54:51.830440 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.830419 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:51.834273 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.834202 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 19:54:51.834423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.834307 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qc7wp\""
Apr 16 19:54:51.837601 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.837579 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jhdrk"]
Apr 16 19:54:51.839474 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:51.839452 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94f7b13_7594_474a_a4d5_fc1f6d448d66.slice/crio-4869e08efe3d46b22cca0c9f078bc4604eaf5e7acc6045a16e50c62933afdb28 WatchSource:0}: Error finding container 4869e08efe3d46b22cca0c9f078bc4604eaf5e7acc6045a16e50c62933afdb28: Status 404 returned error can't find the container with id 4869e08efe3d46b22cca0c9f078bc4604eaf5e7acc6045a16e50c62933afdb28
Apr 16 19:54:51.842886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.842865 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"]
Apr 16 19:54:51.935424 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:51.935367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9ndm\" (UID: \"014c398b-8d9f-4b56-a942-ad2aadadf2f9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:52.036267 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.036241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9ndm\" (UID: \"014c398b-8d9f-4b56-a942-ad2aadadf2f9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:52.036362 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:52.036341 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 19:54:52.036401 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:54:52.036386 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates podName:014c398b-8d9f-4b56-a942-ad2aadadf2f9 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:52.536371284 +0000 UTC m=+52.212373338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-b9ndm" (UID: "014c398b-8d9f-4b56-a942-ad2aadadf2f9") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 19:54:52.109237 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.109202 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhdrk" event={"ID":"d94f7b13-7594-474a-a4d5-fc1f6d448d66","Type":"ContainerStarted","Data":"4869e08efe3d46b22cca0c9f078bc4604eaf5e7acc6045a16e50c62933afdb28"}
Apr 16 19:54:52.110092 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.110067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5xgdl" event={"ID":"fded5762-e4ff-4f63-94bd-04c5209ebead","Type":"ContainerStarted","Data":"91188892a7e463d18640a91e4cb41f5c7872419d6443a2dc554398347a422952"}
Apr 16 19:54:52.111432 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.111410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrkv" event={"ID":"1050de2f-179f-4060-953f-7c1c76584100","Type":"ContainerStarted","Data":"e5e3e6507e7e3b9940a210dabda2082a5f002a43bdb8d9e2f863fa8f0c53e527"}
Apr 16 19:54:52.111510 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.111437 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrkv" event={"ID":"1050de2f-179f-4060-953f-7c1c76584100","Type":"ContainerStarted","Data":"02cc75fa2dd1a7545320b884662d4b3883ebe7bb12afd07df097c1592c9c7d71"}
Apr 16 19:54:52.337583 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.337066 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wf6ls"]
Apr 16 19:54:52.370214 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.369966 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wf6ls"]
Apr 16 19:54:52.370214 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.370157 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wf6ls"
Apr 16 19:54:52.375934 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.375910 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 19:54:52.376349 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.376330 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9s22n\""
Apr 16 19:54:52.376440 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.376353 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 19:54:52.539976 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.539770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9ndm\" (UID: \"014c398b-8d9f-4b56-a942-ad2aadadf2f9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:52.539976 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.539881 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-757ms\" (UniqueName: \"kubernetes.io/projected/96e5dd67-29ce-447d-b662-38afb458d283-kube-api-access-757ms\") pod \"downloads-6bcc868b7-wf6ls\" (UID: \"96e5dd67-29ce-447d-b662-38afb458d283\") " pod="openshift-console/downloads-6bcc868b7-wf6ls"
Apr 16 19:54:52.550710 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.550681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/014c398b-8d9f-4b56-a942-ad2aadadf2f9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9ndm\" (UID: \"014c398b-8d9f-4b56-a942-ad2aadadf2f9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:52.640979 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.640888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-757ms\" (UniqueName: \"kubernetes.io/projected/96e5dd67-29ce-447d-b662-38afb458d283-kube-api-access-757ms\") pod \"downloads-6bcc868b7-wf6ls\" (UID: \"96e5dd67-29ce-447d-b662-38afb458d283\") " pod="openshift-console/downloads-6bcc868b7-wf6ls"
Apr 16 19:54:52.653464 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.653434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-757ms\" (UniqueName: \"kubernetes.io/projected/96e5dd67-29ce-447d-b662-38afb458d283-kube-api-access-757ms\") pod \"downloads-6bcc868b7-wf6ls\" (UID: \"96e5dd67-29ce-447d-b662-38afb458d283\") " pod="openshift-console/downloads-6bcc868b7-wf6ls"
Apr 16 19:54:52.682987 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.682944 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wf6ls"
Apr 16 19:54:52.742295 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.742133 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm" Apr 16 19:54:52.838965 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.837394 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wf6ls"] Apr 16 19:54:52.840230 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:52.840197 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e5dd67_29ce_447d_b662_38afb458d283.slice/crio-16eec92d9e8e992e768d4f69f751c324a60c0b2b04e63753feaa4e90c3e0d294 WatchSource:0}: Error finding container 16eec92d9e8e992e768d4f69f751c324a60c0b2b04e63753feaa4e90c3e0d294: Status 404 returned error can't find the container with id 16eec92d9e8e992e768d4f69f751c324a60c0b2b04e63753feaa4e90c3e0d294 Apr 16 19:54:52.887866 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.887839 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"] Apr 16 19:54:52.893185 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.893115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:54:52.893185 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.893159 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:54:52.896002 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.895984 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:52.896002 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.895998 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:52.896162 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.896023 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\"" Apr 16 19:54:52.896162 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.896034 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:52.897161 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:52.897146 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kmsjn\"" Apr 16 19:54:53.060717 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:54:53.060692 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014c398b_8d9f_4b56_a942_ad2aadadf2f9.slice/crio-71c9713d9dffad8f56a4f5aa517a54d9448e6b70cc7d0ac20bd16faa24c449c5 WatchSource:0}: Error finding container 71c9713d9dffad8f56a4f5aa517a54d9448e6b70cc7d0ac20bd16faa24c449c5: Status 404 returned error can't find the container with id 71c9713d9dffad8f56a4f5aa517a54d9448e6b70cc7d0ac20bd16faa24c449c5 Apr 16 19:54:53.114382 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:53.114342 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wf6ls" 
event={"ID":"96e5dd67-29ce-447d-b662-38afb458d283","Type":"ContainerStarted","Data":"16eec92d9e8e992e768d4f69f751c324a60c0b2b04e63753feaa4e90c3e0d294"} Apr 16 19:54:53.115508 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:53.115473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm" event={"ID":"014c398b-8d9f-4b56-a942-ad2aadadf2f9","Type":"ContainerStarted","Data":"71c9713d9dffad8f56a4f5aa517a54d9448e6b70cc7d0ac20bd16faa24c449c5"} Apr 16 19:54:55.129487 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:55.124875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhdrk" event={"ID":"d94f7b13-7594-474a-a4d5-fc1f6d448d66","Type":"ContainerStarted","Data":"c28735c65c33c12431e53fe6209ea65a041ad1d3c342d1e907daf9a5d729bb82"} Apr 16 19:54:55.129487 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:55.126741 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5xgdl" event={"ID":"fded5762-e4ff-4f63-94bd-04c5209ebead","Type":"ContainerStarted","Data":"2e6dcfa8347a41e1bd1e4c43427cfb6c115f4cbbb26a358515bae347941bb33c"} Apr 16 19:54:55.129487 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:55.129226 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrkv" event={"ID":"1050de2f-179f-4060-953f-7c1c76584100","Type":"ContainerStarted","Data":"e3c683b584dc7cfc3a52ebc3312e5f07684d4d3e1fa27586d6d3065e3c35e5b9"} Apr 16 19:54:56.135293 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.135256 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jhdrk" event={"ID":"d94f7b13-7594-474a-a4d5-fc1f6d448d66","Type":"ContainerStarted","Data":"8eac0b95a7024ef7e3d7a5ca21ae09eefef1240c078c0447b9c0b764244aa001"} Apr 16 19:54:56.135700 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.135399 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-dns/dns-default-jhdrk"
Apr 16 19:54:56.136946 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.136920 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm" event={"ID":"014c398b-8d9f-4b56-a942-ad2aadadf2f9","Type":"ContainerStarted","Data":"4647f5e57688d5c7bbe5a978eaf08a018a0bdbaec9384c9d9fd309842e29cdb3"}
Apr 16 19:54:56.137132 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.137116 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:56.143124 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.143101 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm"
Apr 16 19:54:56.154544 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.154048 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5xgdl" podStartSLOduration=2.146498332 podStartE2EDuration="5.15400182s" podCreationTimestamp="2026-04-16 19:54:51 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.819365083 +0000 UTC m=+51.495367137" lastFinishedPulling="2026-04-16 19:54:54.826868548 +0000 UTC m=+54.502870625" observedRunningTime="2026-04-16 19:54:55.15036262 +0000 UTC m=+54.826364695" watchObservedRunningTime="2026-04-16 19:54:56.15400182 +0000 UTC m=+55.830003898"
Apr 16 19:54:56.155373 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.155144 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jhdrk" podStartSLOduration=2.169979772 podStartE2EDuration="5.155133639s" podCreationTimestamp="2026-04-16 19:54:51 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.841244547 +0000 UTC m=+51.517246616" lastFinishedPulling="2026-04-16 19:54:54.826398422 +0000 UTC m=+54.502400483" observedRunningTime="2026-04-16 19:54:56.152325185 +0000 UTC m=+55.828327266" watchObservedRunningTime="2026-04-16 19:54:56.155133639 +0000 UTC m=+55.831135719"
Apr 16 19:54:56.167067 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.167023 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9ndm" podStartSLOduration=3.138339633 podStartE2EDuration="5.167010265s" podCreationTimestamp="2026-04-16 19:54:51 +0000 UTC" firstStartedPulling="2026-04-16 19:54:53.06251585 +0000 UTC m=+52.738517906" lastFinishedPulling="2026-04-16 19:54:55.09118647 +0000 UTC m=+54.767188538" observedRunningTime="2026-04-16 19:54:56.166632236 +0000 UTC m=+55.842634400" watchObservedRunningTime="2026-04-16 19:54:56.167010265 +0000 UTC m=+55.843012342"
Apr 16 19:54:56.856423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.856393 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"]
Apr 16 19:54:56.888399 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.888372 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"]
Apr 16 19:54:56.888563 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.888539 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:56.891480 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.891455 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 19:54:56.891480 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.891467 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:54:56.891636 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.891462 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:54:56.891636 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.891549 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 19:54:56.892841 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.892771 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:54:56.892954 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.892860 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-hsvf6\""
Apr 16 19:54:56.973925 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.973897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccb4b340-2600-42d0-af5c-929cb99cf57c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:56.974077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.973933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7zf4\" (UniqueName: \"kubernetes.io/projected/ccb4b340-2600-42d0-af5c-929cb99cf57c-kube-api-access-j7zf4\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:56.974077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.973963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:56.974077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:56.974032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.074462 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.074396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccb4b340-2600-42d0-af5c-929cb99cf57c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.074462 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.074439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7zf4\" (UniqueName: \"kubernetes.io/projected/ccb4b340-2600-42d0-af5c-929cb99cf57c-kube-api-access-j7zf4\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.074649 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.074467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.074649 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.074493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.075252 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.075228 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccb4b340-2600-42d0-af5c-929cb99cf57c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.077200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.077158 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.077330 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.077236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ccb4b340-2600-42d0-af5c-929cb99cf57c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.086858 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.086839 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7zf4\" (UniqueName: \"kubernetes.io/projected/ccb4b340-2600-42d0-af5c-929cb99cf57c-kube-api-access-j7zf4\") pod \"prometheus-operator-5676c8c784-gqkzx\" (UID: \"ccb4b340-2600-42d0-af5c-929cb99cf57c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.144960 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.144921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrkv" event={"ID":"1050de2f-179f-4060-953f-7c1c76584100","Type":"ContainerStarted","Data":"f70925402f73abd67b00cc8bf28a1d52a175688d578ad30a97291c70bc863556"}
Apr 16 19:54:57.168665 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.168621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qsrkv" podStartSLOduration=1.288046799 podStartE2EDuration="6.168608231s" podCreationTimestamp="2026-04-16 19:54:51 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.911001694 +0000 UTC m=+51.587003749" lastFinishedPulling="2026-04-16 19:54:56.791563101 +0000 UTC m=+56.467565181" observedRunningTime="2026-04-16 19:54:57.167692673 +0000 UTC m=+56.843694749" watchObservedRunningTime="2026-04-16 19:54:57.168608231 +0000 UTC m=+56.844610343"
Apr 16 19:54:57.199262 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.199242 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"
Apr 16 19:54:57.329068 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:57.328988 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gqkzx"]
Apr 16 19:54:58.067338 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:58.067308 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cspns"
Apr 16 19:54:58.149536 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:58.149493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx" event={"ID":"ccb4b340-2600-42d0-af5c-929cb99cf57c","Type":"ContainerStarted","Data":"33785dae0502547085a5c46651effed69588eda4f4b98a23da0c542e00620030"}
Apr 16 19:54:59.153949 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:59.153912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx" event={"ID":"ccb4b340-2600-42d0-af5c-929cb99cf57c","Type":"ContainerStarted","Data":"aa431dcca3b65c48cae4f65a1a2a594755e6bbb912b4b38b0d3d9d403c41169a"}
Apr 16 19:54:59.153949 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:59.153953 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx" event={"ID":"ccb4b340-2600-42d0-af5c-929cb99cf57c","Type":"ContainerStarted","Data":"4bb44c76a514863f0adf5664bdd27e2ed0e8c8bc899add65952ede5aa0c576b8"}
Apr 16 19:54:59.182148 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:54:59.182073 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gqkzx" podStartSLOduration=1.789451624 podStartE2EDuration="3.182056951s"
podCreationTimestamp="2026-04-16 19:54:56 +0000 UTC" firstStartedPulling="2026-04-16 19:54:57.335881026 +0000 UTC m=+57.011883084" lastFinishedPulling="2026-04-16 19:54:58.728486351 +0000 UTC m=+58.404488411" observedRunningTime="2026-04-16 19:54:59.180407859 +0000 UTC m=+58.856409935" watchObservedRunningTime="2026-04-16 19:54:59.182056951 +0000 UTC m=+58.858059029"
Apr 16 19:55:01.329446 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.329411 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-97w44"]
Apr 16 19:55:01.333718 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.333690 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.336551 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.336526 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:55:01.336679 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.336622 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:55:01.336775 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.336757 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6xp24\""
Apr 16 19:55:01.336842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.336781 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:55:01.398092 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.398070 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"]
Apr 16 19:55:01.401617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.401598 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.405390 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.405370 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-x49wc\""
Apr 16 19:55:01.405582 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.405553 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 19:55:01.406045 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.406025 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 19:55:01.406150 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.406068 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 19:55:01.408793 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-sys\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.408910 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2td\" (UniqueName: \"kubernetes.io/projected/e2549d95-7be7-41cf-859c-0af719d66591-kube-api-access-zj2td\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.408910 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.408910 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-root\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.409077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408928 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.409077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-metrics-client-ca\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.409077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.408992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-textfile\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.409077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.409044 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-accelerators-collector-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.409417 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.409123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-wtmp\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.416830 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.416803 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"]
Apr 16 19:55:01.509993 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.509964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.509993 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.509997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-metrics-client-ca\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-textfile\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-accelerators-collector-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:01.510160 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 19:55:01.510249 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:01.510250 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls podName:e2549d95-7be7-41cf-859c-0af719d66591 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:02.010228745 +0000 UTC m=+61.686230804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls") pod "node-exporter-97w44" (UID: "e2549d95-7be7-41cf-859c-0af719d66591") : secret "node-exporter-tls" not found
Apr 16 19:55:01.510533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.510533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.510533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtzw\" (UniqueName: \"kubernetes.io/projected/943336c5-9e13-4677-a528-a07e32a448ef-kube-api-access-swtzw\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"
Apr 16 19:55:01.510533 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-sys\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-root\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-sys\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44"
Apr 16 19:55:01.510712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-wtmp\") pod \"node-exporter-97w44\" (UID:
\"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510684 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943336c5-9e13-4677-a528-a07e32a448ef-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-root\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-accelerators-collector-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510726 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2td\" (UniqueName: \"kubernetes.io/projected/e2549d95-7be7-41cf-859c-0af719d66591-kube-api-access-zj2td\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2549d95-7be7-41cf-859c-0af719d66591-metrics-client-ca\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-wtmp\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.510922 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.510863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-textfile\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.513055 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.513034 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.522302 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.522275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2td\" (UniqueName: \"kubernetes.io/projected/e2549d95-7be7-41cf-859c-0af719d66591-kube-api-access-zj2td\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:01.612076 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.611996 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943336c5-9e13-4677-a528-a07e32a448ef-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612076 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612064 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612277 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612098 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612277 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612277 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612277 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612193 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swtzw\" (UniqueName: \"kubernetes.io/projected/943336c5-9e13-4677-a528-a07e32a448ef-kube-api-access-swtzw\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.612895 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.612864 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/943336c5-9e13-4677-a528-a07e32a448ef-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.613053 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.613035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.613663 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.613621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.615316 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.615295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.615418 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.615370 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/943336c5-9e13-4677-a528-a07e32a448ef-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.621679 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.621639 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtzw\" (UniqueName: \"kubernetes.io/projected/943336c5-9e13-4677-a528-a07e32a448ef-kube-api-access-swtzw\") pod \"kube-state-metrics-69db897b98-z6zfs\" (UID: \"943336c5-9e13-4677-a528-a07e32a448ef\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.711653 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.711622 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" Apr 16 19:55:01.843931 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:01.843862 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-z6zfs"] Apr 16 19:55:01.846889 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:01.846857 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943336c5_9e13_4677_a528_a07e32a448ef.slice/crio-cff48a3fe64d31845b23defc0c79a260bfea2bf71cfaa1f80ac02c72b52a12ba WatchSource:0}: Error finding container cff48a3fe64d31845b23defc0c79a260bfea2bf71cfaa1f80ac02c72b52a12ba: Status 404 returned error can't find the container with id cff48a3fe64d31845b23defc0c79a260bfea2bf71cfaa1f80ac02c72b52a12ba Apr 16 19:55:02.017224 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.017196 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:02.019650 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.019623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e2549d95-7be7-41cf-859c-0af719d66591-node-exporter-tls\") pod \"node-exporter-97w44\" (UID: \"e2549d95-7be7-41cf-859c-0af719d66591\") " pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:02.163536 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.163496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" event={"ID":"943336c5-9e13-4677-a528-a07e32a448ef","Type":"ContainerStarted","Data":"cff48a3fe64d31845b23defc0c79a260bfea2bf71cfaa1f80ac02c72b52a12ba"} 
Apr 16 19:55:02.244467 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.244424 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-97w44" Apr 16 19:55:02.253867 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:02.253837 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2549d95_7be7_41cf_859c_0af719d66591.slice/crio-0cc5d8c0f79d77dff6663fabe68a1d204a565aa324d2ef27aec199d9dd890cb8 WatchSource:0}: Error finding container 0cc5d8c0f79d77dff6663fabe68a1d204a565aa324d2ef27aec199d9dd890cb8: Status 404 returned error can't find the container with id 0cc5d8c0f79d77dff6663fabe68a1d204a565aa324d2ef27aec199d9dd890cb8 Apr 16 19:55:02.384119 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.384039 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:02.389324 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.389299 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.395753 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.395729 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:55:02.395875 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.395832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:55:02.395937 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.395883 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:55:02.395937 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.395839 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:55:02.396094 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396074 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:55:02.396186 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396133 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:55:02.396641 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396304 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:55:02.396641 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396370 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-p6pfn\"" Apr 16 19:55:02.396641 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396441 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:55:02.396641 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.396454 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:55:02.407078 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.407054 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:02.420491 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420511 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420625 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420540 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420743 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420743 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420703 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420843 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420843 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420818 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tz9s\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420921 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.420921 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.421007 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.421007 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.420991 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.421227 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.421034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.421227 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.421057 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.521877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.521930 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.521966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.521996 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522034 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522093 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522200 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.522662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tz9s\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522847 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523203 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.522941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out\") pod \"alertmanager-main-0\" (UID: 
\"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.523569 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:02.523485 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle podName:b8413c40-c771-4971-b93c-eee3e92225b3 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:03.023461317 +0000 UTC m=+62.699463371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3") : configmap references non-existent config key: ca-bundle.crt Apr 16 19:55:02.525134 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.525109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.525791 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.525767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.526849 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.526783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.528198 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.528139 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.529039 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.529012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.530871 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.530402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.530871 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.530511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.530871 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.530709 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy\") pod 
\"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.530871 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.530836 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.531318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.531297 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:02.535196 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:02.535149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tz9s\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:03.028143 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:03.028106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:03.028891 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:03.028870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:03.168822 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:03.168786 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-97w44" event={"ID":"e2549d95-7be7-41cf-859c-0af719d66591","Type":"ContainerStarted","Data":"0cc5d8c0f79d77dff6663fabe68a1d204a565aa324d2ef27aec199d9dd890cb8"} Apr 16 19:55:03.304331 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:03.304255 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:55:05.768103 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.768066 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7878fffd95-cgn9r"] Apr 16 19:55:05.773219 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.773195 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.777612 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.777591 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 19:55:05.777612 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.777603 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:55:05.777825 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.777625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 19:55:05.777825 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.777712 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-ncljg\"" Apr 16 19:55:05.778065 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.777964 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 19:55:05.778065 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.778035 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-e6hdi901mejfn\"" Apr 16 19:55:05.792079 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.792058 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7878fffd95-cgn9r"] Apr 16 19:55:05.854678 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854649 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-client-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " 
pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.854837 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-metrics-server-audit-profiles\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.854837 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-client-certs\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.854972 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ebba194-30e5-4b1f-bdee-29507d5ed72d-audit-log\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.854972 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.854972 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:55:05.854931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-tls\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.855116 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.854973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t625l\" (UniqueName: \"kubernetes.io/projected/6ebba194-30e5-4b1f-bdee-29507d5ed72d-kube-api-access-t625l\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.955806 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.955767 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t625l\" (UniqueName: \"kubernetes.io/projected/6ebba194-30e5-4b1f-bdee-29507d5ed72d-kube-api-access-t625l\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.955989 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.955937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-client-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956055 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.955986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-metrics-server-audit-profiles\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956103 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.956065 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-client-certs\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956150 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.956089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ebba194-30e5-4b1f-bdee-29507d5ed72d-audit-log\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956272 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.956159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.956356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-tls\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " 
pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.956774 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.956631 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6ebba194-30e5-4b1f-bdee-29507d5ed72d-audit-log\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.957137 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.957112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.957254 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.957188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6ebba194-30e5-4b1f-bdee-29507d5ed72d-metrics-server-audit-profiles\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.959034 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.959008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-tls\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.959131 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.959101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-client-ca-bundle\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.959259 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.959241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6ebba194-30e5-4b1f-bdee-29507d5ed72d-secret-metrics-server-client-certs\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:05.968703 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:05.968678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t625l\" (UniqueName: \"kubernetes.io/projected/6ebba194-30e5-4b1f-bdee-29507d5ed72d-kube-api-access-t625l\") pod \"metrics-server-7878fffd95-cgn9r\" (UID: \"6ebba194-30e5-4b1f-bdee-29507d5ed72d\") " pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:06.033620 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.033553 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5"] Apr 16 19:55:06.038312 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.038288 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:06.041575 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.041552 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 19:55:06.041694 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.041563 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-hjnpl\"" Apr 16 19:55:06.049094 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.049056 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5"] Apr 16 19:55:06.084868 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.084837 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" Apr 16 19:55:06.147596 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.147563 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jhdrk" Apr 16 19:55:06.158008 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.157978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfvs5\" (UID: \"0c00b183-44d6-4f42-be34-b7f63056fa91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:06.259324 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.259286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfvs5\" (UID: \"0c00b183-44d6-4f42-be34-b7f63056fa91\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:06.259505 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:06.259452 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 19:55:06.259568 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:06.259535 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert podName:0c00b183-44d6-4f42-be34-b7f63056fa91 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:06.759516076 +0000 UTC m=+66.435518129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-kfvs5" (UID: "0c00b183-44d6-4f42-be34-b7f63056fa91") : secret "monitoring-plugin-cert" not found Apr 16 19:55:06.581753 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.581719 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv"] Apr 16 19:55:06.585364 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.585336 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.588662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588516 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 19:55:06.588662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588582 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p4bg6\"" Apr 16 19:55:06.588662 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588590 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 19:55:06.588870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588693 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 19:55:06.588870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 19:55:06.588961 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.588944 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 19:55:06.595029 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.595003 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 19:55:06.597686 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.597662 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv"] Apr 16 19:55:06.663161 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663126 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663326 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663237 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-serving-certs-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663402 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663402 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:55:06.663513 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-metrics-client-ca\") pod 
\"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663513 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663611 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-federate-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663611 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.663688 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.663609 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwks\" (UniqueName: \"kubernetes.io/projected/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-kube-api-access-9bwks\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" 
Apr 16 19:55:06.666470 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.666446 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:55:06.676099 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.676076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12ed67c2-088e-47ad-b2f4-d5da475ea9fc-metrics-certs\") pod \"network-metrics-daemon-v62bb\" (UID: \"12ed67c2-088e-47ad-b2f4-d5da475ea9fc\") " pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:55:06.708129 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.708105 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\"" Apr 16 19:55:06.715618 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.715597 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v62bb" Apr 16 19:55:06.764965 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.764927 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-serving-certs-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765091 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.764992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfvs5\" (UID: \"0c00b183-44d6-4f42-be34-b7f63056fa91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:06.765091 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:55:06.765035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765253 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-metrics-client-ca\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765253 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765253 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-federate-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765253 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765253 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765224 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwks\" (UniqueName: \"kubernetes.io/projected/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-kube-api-access-9bwks\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:55:06.765490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.765793 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.765760 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-serving-certs-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " 
pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.767231 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.767163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.768349 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.768314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-metrics-client-ca\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.768832 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.768812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-federate-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.769339 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.769319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.769838 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.769621 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:55:06.769946 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.769914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.770046 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.770009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c00b183-44d6-4f42-be34-b7f63056fa91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfvs5\" (UID: \"0c00b183-44d6-4f42-be34-b7f63056fa91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:06.773468 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.773448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-telemeter-client-tls\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.776336 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.776315 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwks\" (UniqueName: \"kubernetes.io/projected/824d29df-c9d9-42ca-b9b1-94f18a2e17ee-kube-api-access-9bwks\") pod \"telemeter-client-5b7cd7d77f-2jlnv\" (UID: \"824d29df-c9d9-42ca-b9b1-94f18a2e17ee\") " pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.778341 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.778304 2568 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:55:06.788998 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.788979 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9qv\" (UniqueName: \"kubernetes.io/projected/fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4-kube-api-access-hq9qv\") pod \"network-check-target-dpc5h\" (UID: \"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4\") " pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:55:06.895980 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.895898 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" Apr 16 19:55:06.950825 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:06.950789 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" Apr 16 19:55:07.013816 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:07.013788 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kmsjn\"" Apr 16 19:55:07.021409 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:07.021384 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:55:10.389775 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.389736 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv"] Apr 16 19:55:10.465146 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.465118 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7878fffd95-cgn9r"] Apr 16 19:55:10.629734 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.629698 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5"] Apr 16 19:55:10.631870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.631846 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"] Apr 16 19:55:10.638756 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:10.638724 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d29df_c9d9_42ca_b9b1_94f18a2e17ee.slice/crio-57c021d19486fa9fa795a6040e48460d461a4e99989183346df075ab9a302774 WatchSource:0}: Error finding container 57c021d19486fa9fa795a6040e48460d461a4e99989183346df075ab9a302774: Status 404 returned error can't find the container with id 57c021d19486fa9fa795a6040e48460d461a4e99989183346df075ab9a302774 Apr 16 19:55:10.639772 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:10.639735 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ebba194_30e5_4b1f_bdee_29507d5ed72d.slice/crio-e27e8e640a4673aa0cd15e04f83092da98ca92839632f208a645c3ee33034587 WatchSource:0}: Error finding container e27e8e640a4673aa0cd15e04f83092da98ca92839632f208a645c3ee33034587: Status 404 returned error can't find the container with id e27e8e640a4673aa0cd15e04f83092da98ca92839632f208a645c3ee33034587 Apr 16 
19:55:10.641395 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:10.641374 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c00b183_44d6_4f42_be34_b7f63056fa91.slice/crio-17ce16be5768802d7f3d871176c343c59cab6a6779a4d65b0780ccb3c63c83a0 WatchSource:0}: Error finding container 17ce16be5768802d7f3d871176c343c59cab6a6779a4d65b0780ccb3c63c83a0: Status 404 returned error can't find the container with id 17ce16be5768802d7f3d871176c343c59cab6a6779a4d65b0780ccb3c63c83a0 Apr 16 19:55:10.647208 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:10.647144 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ed67c2_088e_47ad_b2f4_d5da475ea9fc.slice/crio-6d8d6561fd3d68ef4e1c4845ba1b3ab46efcd2f733a63d3c6f1e17b852b6e751 WatchSource:0}: Error finding container 6d8d6561fd3d68ef4e1c4845ba1b3ab46efcd2f733a63d3c6f1e17b852b6e751: Status 404 returned error can't find the container with id 6d8d6561fd3d68ef4e1c4845ba1b3ab46efcd2f733a63d3c6f1e17b852b6e751 Apr 16 19:55:10.647891 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:10.647862 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe9d3b1f_3e6a_4f4c_86f4_1a3f245813b4.slice/crio-eeb3047af05f1e4f9f0dd66e375d97cd7370b13db107e41781b834316c9e2fc1 WatchSource:0}: Error finding container eeb3047af05f1e4f9f0dd66e375d97cd7370b13db107e41781b834316c9e2fc1: Status 404 returned error can't find the container with id eeb3047af05f1e4f9f0dd66e375d97cd7370b13db107e41781b834316c9e2fc1 Apr 16 19:55:10.654689 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.654668 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v62bb"] Apr 16 19:55:10.654779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.654698 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-dpc5h"] Apr 16 19:55:10.654779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.654712 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:55:10.654779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.654726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"] Apr 16 19:55:10.654923 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.654846 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.661932 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.661899 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:55:10.662234 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.661901 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:55:10.662863 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.662554 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:55:10.662863 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.662569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7f55r\"" Apr 16 19:55:10.662863 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.662710 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:55:10.662863 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.662761 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:55:10.668614 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.668545 2568 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 19:55:10.698973 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.698829 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.698973 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.698864 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.698973 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.698903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.699199 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.699012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.699199 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.699048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.699199 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.699098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.699199 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.699127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ccb\" (UniqueName: \"kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.799697 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.799661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.799885 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.799866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.800221 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:55:10.800202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.800456 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.800439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.800683 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.800663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.800809 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.800787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.800886 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.800799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 
19:55:10.801003 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.800987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ccb\" (UniqueName: \"kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.801443 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.801405 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.801833 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.801715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.802102 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.802067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.802677 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.802652 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " 
pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.803737 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.803711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.813924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.813906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ccb\" (UniqueName: \"kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb\") pod \"console-7d6ddbc884-hfkkp\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") " pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:10.969604 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:10.969528 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d6ddbc884-hfkkp" Apr 16 19:55:11.201446 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.201356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" event={"ID":"943336c5-9e13-4677-a528-a07e32a448ef","Type":"ContainerStarted","Data":"9b6c30b7de062a66659b92138eea667529e7efba1ee35e67471b8cc89ab642e4"} Apr 16 19:55:11.201446 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.201404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" event={"ID":"943336c5-9e13-4677-a528-a07e32a448ef","Type":"ContainerStarted","Data":"8f3395bb43c08c20efebc16e8b602726932133caa55e8687175d3e9288ed46d9"} Apr 16 19:55:11.201446 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.201421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" event={"ID":"943336c5-9e13-4677-a528-a07e32a448ef","Type":"ContainerStarted","Data":"c5fa29a5e6395949e735b266653c006979f698077e57fce8599ba8ee27071de9"} Apr 16 19:55:11.204137 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.204077 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dpc5h" event={"ID":"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4","Type":"ContainerStarted","Data":"eeb3047af05f1e4f9f0dd66e375d97cd7370b13db107e41781b834316c9e2fc1"} Apr 16 19:55:11.207772 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.207730 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wf6ls" event={"ID":"96e5dd67-29ce-447d-b662-38afb458d283","Type":"ContainerStarted","Data":"14581dbc452ace29c0392533bbb3eafda3cc19a6ca6a9bd8f102a94bf943f673"} Apr 16 19:55:11.208480 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.208443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/downloads-6bcc868b7-wf6ls" Apr 16 19:55:11.209346 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.209294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"a4a33d953450e55a7d39d7e3bf8ea51347734ce44d5ec6c38a77b0b3e797a8bb"} Apr 16 19:55:11.211234 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.211207 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v62bb" event={"ID":"12ed67c2-088e-47ad-b2f4-d5da475ea9fc","Type":"ContainerStarted","Data":"6d8d6561fd3d68ef4e1c4845ba1b3ab46efcd2f733a63d3c6f1e17b852b6e751"} Apr 16 19:55:11.212822 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.212754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" event={"ID":"6ebba194-30e5-4b1f-bdee-29507d5ed72d","Type":"ContainerStarted","Data":"e27e8e640a4673aa0cd15e04f83092da98ca92839632f208a645c3ee33034587"} Apr 16 19:55:11.214493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.214470 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" event={"ID":"0c00b183-44d6-4f42-be34-b7f63056fa91","Type":"ContainerStarted","Data":"17ce16be5768802d7f3d871176c343c59cab6a6779a4d65b0780ccb3c63c83a0"} Apr 16 19:55:11.215703 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.215678 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" event={"ID":"824d29df-c9d9-42ca-b9b1-94f18a2e17ee","Type":"ContainerStarted","Data":"57c021d19486fa9fa795a6040e48460d461a4e99989183346df075ab9a302774"} Apr 16 19:55:11.225056 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.224986 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wf6ls" Apr 16 
19:55:11.239257 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.238533 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-z6zfs" podStartSLOduration=1.896506896 podStartE2EDuration="10.238516598s" podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:01.849029002 +0000 UTC m=+61.525031057" lastFinishedPulling="2026-04-16 19:55:10.191038704 +0000 UTC m=+69.867040759" observedRunningTime="2026-04-16 19:55:11.234711767 +0000 UTC m=+70.910713865" watchObservedRunningTime="2026-04-16 19:55:11.238516598 +0000 UTC m=+70.914518675" Apr 16 19:55:11.241018 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.240999 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"] Apr 16 19:55:11.267321 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:11.267288 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe03b8be_957c_4a17_9b1a_6605af5d7e6c.slice/crio-e5a773d2401f636a9d80f38c34cd829011277178e165631da4abdc1d6543a2e8 WatchSource:0}: Error finding container e5a773d2401f636a9d80f38c34cd829011277178e165631da4abdc1d6543a2e8: Status 404 returned error can't find the container with id e5a773d2401f636a9d80f38c34cd829011277178e165631da4abdc1d6543a2e8 Apr 16 19:55:11.272990 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:11.272934 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wf6ls" podStartSLOduration=1.9232682840000002 podStartE2EDuration="19.272914262s" podCreationTimestamp="2026-04-16 19:54:52 +0000 UTC" firstStartedPulling="2026-04-16 19:54:52.842321258 +0000 UTC m=+52.518323315" lastFinishedPulling="2026-04-16 19:55:10.191967236 +0000 UTC m=+69.867969293" observedRunningTime="2026-04-16 19:55:11.269646777 +0000 UTC m=+70.945648853" watchObservedRunningTime="2026-04-16 
19:55:11.272914262 +0000 UTC m=+70.948916336" Apr 16 19:55:12.227230 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:12.226257 2568 generic.go:358] "Generic (PLEG): container finished" podID="e2549d95-7be7-41cf-859c-0af719d66591" containerID="b2cfc4b4debfa624c6cdc01e5ec210ea6683dfddf6a46f28fc2a05fd0f8f521e" exitCode=0 Apr 16 19:55:12.227230 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:12.226352 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-97w44" event={"ID":"e2549d95-7be7-41cf-859c-0af719d66591","Type":"ContainerDied","Data":"b2cfc4b4debfa624c6cdc01e5ec210ea6683dfddf6a46f28fc2a05fd0f8f521e"} Apr 16 19:55:12.230205 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:12.229897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6ddbc884-hfkkp" event={"ID":"fe03b8be-957c-4a17-9b1a-6605af5d7e6c","Type":"ContainerStarted","Data":"e5a773d2401f636a9d80f38c34cd829011277178e165631da4abdc1d6543a2e8"} Apr 16 19:55:20.258138 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.258088 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dpc5h" event={"ID":"fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4","Type":"ContainerStarted","Data":"5e3c3227be2d68b2aebb39024a35f5d7e09614fe5b54ca69e2e61d19feaa778c"} Apr 16 19:55:20.258686 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.258369 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dpc5h" Apr 16 19:55:20.260472 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.260445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-97w44" event={"ID":"e2549d95-7be7-41cf-859c-0af719d66591","Type":"ContainerStarted","Data":"e8aa3f4b93f707ab6744b159900a850915979905d2b86bfb8e63d841bdff653a"} Apr 16 19:55:20.260598 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.260475 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-97w44" event={"ID":"e2549d95-7be7-41cf-859c-0af719d66591","Type":"ContainerStarted","Data":"c6c5259fde79bda8987de1592bbcb3ee481e5bb563db7f4e0917b503dd5a8416"} Apr 16 19:55:20.262156 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.262130 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8" exitCode=0 Apr 16 19:55:20.262285 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.262211 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8"} Apr 16 19:55:20.266276 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.266244 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v62bb" event={"ID":"12ed67c2-088e-47ad-b2f4-d5da475ea9fc","Type":"ContainerStarted","Data":"c23cb6938e6e55d8fbc0069ff8acb0e1dd2589872a1e3a2c1c3ce26bcb72bacf"} Apr 16 19:55:20.266376 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.266276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v62bb" event={"ID":"12ed67c2-088e-47ad-b2f4-d5da475ea9fc","Type":"ContainerStarted","Data":"21a050b3e143efa9f4e967a390e7da464e11588d5a31156b4b5f43b44909bf40"} Apr 16 19:55:20.267851 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.267826 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" event={"ID":"6ebba194-30e5-4b1f-bdee-29507d5ed72d","Type":"ContainerStarted","Data":"c6e6090ed33e860aee021296adb27576e1462565ddb08c21d2dd1c5bfded4100"} Apr 16 19:55:20.269499 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.269474 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" event={"ID":"0c00b183-44d6-4f42-be34-b7f63056fa91","Type":"ContainerStarted","Data":"0ed8511c022c2ebe6ae5a1adc4003e2ab77019351d38a65a7d83e48fb383df0e"}
Apr 16 19:55:20.269799 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.269777 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5"
Apr 16 19:55:20.272020 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.271994 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" event={"ID":"824d29df-c9d9-42ca-b9b1-94f18a2e17ee","Type":"ContainerStarted","Data":"16fefbe2a8f38bb644a2d83487c41fdc303d67ba70cd11ab4953f6cddc17d36c"}
Apr 16 19:55:20.272112 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.272030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" event={"ID":"824d29df-c9d9-42ca-b9b1-94f18a2e17ee","Type":"ContainerStarted","Data":"9b5f928b852447fca508d5a224fcc7f4de6d282aa9a564fb69e4a7de2bf57bd3"}
Apr 16 19:55:20.272112 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.272050 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" event={"ID":"824d29df-c9d9-42ca-b9b1-94f18a2e17ee","Type":"ContainerStarted","Data":"986eff2ab5e71cb03a44820783b4c9c6aaaf4dbc54b8a4f2e6f8dd7b5c4df1d8"}
Apr 16 19:55:20.274825 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.274776 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dpc5h" podStartSLOduration=70.428744912 podStartE2EDuration="1m19.274762937s" podCreationTimestamp="2026-04-16 19:54:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.651766338 +0000 UTC m=+70.327768393" lastFinishedPulling="2026-04-16 19:55:19.497784353 +0000 UTC m=+79.173786418" observedRunningTime="2026-04-16 19:55:20.273651219 +0000 UTC m=+79.949653308" watchObservedRunningTime="2026-04-16 19:55:20.274762937 +0000 UTC m=+79.950765014"
Apr 16 19:55:20.275607 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.275579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6ddbc884-hfkkp" event={"ID":"fe03b8be-957c-4a17-9b1a-6605af5d7e6c","Type":"ContainerStarted","Data":"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"}
Apr 16 19:55:20.276592 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.276540 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5"
Apr 16 19:55:20.293188 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.289752 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfvs5" podStartSLOduration=5.438676516 podStartE2EDuration="14.289737474s" podCreationTimestamp="2026-04-16 19:55:06 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.646257733 +0000 UTC m=+70.322259795" lastFinishedPulling="2026-04-16 19:55:19.4973187 +0000 UTC m=+79.173320753" observedRunningTime="2026-04-16 19:55:20.288045746 +0000 UTC m=+79.964047824" watchObservedRunningTime="2026-04-16 19:55:20.289737474 +0000 UTC m=+79.965739551"
Apr 16 19:55:20.309667 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.309621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b7cd7d77f-2jlnv" podStartSLOduration=5.453140948 podStartE2EDuration="14.309606198s" podCreationTimestamp="2026-04-16 19:55:06 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.64114264 +0000 UTC m=+70.317144705" lastFinishedPulling="2026-04-16 19:55:19.497607895 +0000 UTC m=+79.173609955" observedRunningTime="2026-04-16 19:55:20.307915555 +0000 UTC m=+79.983917621" watchObservedRunningTime="2026-04-16 19:55:20.309606198 +0000 UTC m=+79.985608278"
Apr 16 19:55:20.326857 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.326810 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v62bb" podStartSLOduration=70.548013442 podStartE2EDuration="1m19.326797638s" podCreationTimestamp="2026-04-16 19:54:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.654441206 +0000 UTC m=+70.330443264" lastFinishedPulling="2026-04-16 19:55:19.433225399 +0000 UTC m=+79.109227460" observedRunningTime="2026-04-16 19:55:20.325374485 +0000 UTC m=+80.001376585" watchObservedRunningTime="2026-04-16 19:55:20.326797638 +0000 UTC m=+80.002799713"
Apr 16 19:55:20.377232 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.377159 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r" podStartSLOduration=6.523162647 podStartE2EDuration="15.377141161s" podCreationTimestamp="2026-04-16 19:55:05 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.645204115 +0000 UTC m=+70.321206170" lastFinishedPulling="2026-04-16 19:55:19.49918263 +0000 UTC m=+79.175184684" observedRunningTime="2026-04-16 19:55:20.375939913 +0000 UTC m=+80.051942005" watchObservedRunningTime="2026-04-16 19:55:20.377141161 +0000 UTC m=+80.053143238"
Apr 16 19:55:20.406273 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.406216 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-97w44" podStartSLOduration=10.589973245 podStartE2EDuration="19.406197119s" podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:02.256209092 +0000 UTC m=+61.932211152" lastFinishedPulling="2026-04-16 19:55:11.072432968 +0000 UTC m=+70.748435026" observedRunningTime="2026-04-16 19:55:20.404552991 +0000 UTC m=+80.080555091" watchObservedRunningTime="2026-04-16 19:55:20.406197119 +0000 UTC m=+80.082199187"
Apr 16 19:55:20.425422 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.425362 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d6ddbc884-hfkkp" podStartSLOduration=2.2013054260000002 podStartE2EDuration="10.425343822s" podCreationTimestamp="2026-04-16 19:55:10 +0000 UTC" firstStartedPulling="2026-04-16 19:55:11.282057204 +0000 UTC m=+70.958059258" lastFinishedPulling="2026-04-16 19:55:19.506095597 +0000 UTC m=+79.182097654" observedRunningTime="2026-04-16 19:55:20.424774411 +0000 UTC m=+80.100776539" watchObservedRunningTime="2026-04-16 19:55:20.425343822 +0000 UTC m=+80.101345900"
Apr 16 19:55:20.970691 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.970330 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:20.970691 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.970647 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:20.978027 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:20.977836 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:21.282628 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.282602 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:21.405283 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.405249 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-df5f56696-pczr2"]
Apr 16 19:55:21.427435 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.427407 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df5f56696-pczr2"]
Apr 16 19:55:21.427594 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.427509 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518714 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.518924 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.519153 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.518962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwqb\" (UniqueName: \"kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620335 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620335 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620307 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwqb\" (UniqueName: \"kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.620603 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.620549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.621271 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.621235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.621413 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.621338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.621602 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.621579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.621602 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.621586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.622867 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.622844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.623073 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.623055 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.634527 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.634497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwqb\" (UniqueName: \"kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb\") pod \"console-df5f56696-pczr2\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.738963 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.738937 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:21.860490 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:21.860466 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-df5f56696-pczr2"]
Apr 16 19:55:21.862752 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:55:21.862731 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3edd642e_36e5_4b56_842b_132d571bb17b.slice/crio-bc5aecb7c9cb4cb274ef5b6b64507794b627887ffb08690e427b14f25bbb7c65 WatchSource:0}: Error finding container bc5aecb7c9cb4cb274ef5b6b64507794b627887ffb08690e427b14f25bbb7c65: Status 404 returned error can't find the container with id bc5aecb7c9cb4cb274ef5b6b64507794b627887ffb08690e427b14f25bbb7c65
Apr 16 19:55:22.283604 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:22.283570 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6"}
Apr 16 19:55:22.285025 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:22.285001 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5f56696-pczr2" event={"ID":"3edd642e-36e5-4b56-842b-132d571bb17b","Type":"ContainerStarted","Data":"c7820b075150c3ed1f2d91f0cf26ec4ffdc94fdb115772026e2aef1048d090f7"}
Apr 16 19:55:22.285141 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:22.285030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5f56696-pczr2" event={"ID":"3edd642e-36e5-4b56-842b-132d571bb17b","Type":"ContainerStarted","Data":"bc5aecb7c9cb4cb274ef5b6b64507794b627887ffb08690e427b14f25bbb7c65"}
Apr 16 19:55:22.305244 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:22.305196 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-df5f56696-pczr2" podStartSLOduration=1.30515939 podStartE2EDuration="1.30515939s" podCreationTimestamp="2026-04-16 19:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:22.302978986 +0000 UTC m=+81.978981062" watchObservedRunningTime="2026-04-16 19:55:22.30515939 +0000 UTC m=+81.981161467"
Apr 16 19:55:23.291963 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:23.291926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9"}
Apr 16 19:55:23.291963 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:23.291967 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e"}
Apr 16 19:55:23.292491 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:23.291981 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af"}
Apr 16 19:55:23.292491 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:23.292027 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6"}
Apr 16 19:55:24.301851 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:24.301815 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerStarted","Data":"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375"}
Apr 16 19:55:24.337608 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:24.337554 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=9.115334645 podStartE2EDuration="22.337536777s" podCreationTimestamp="2026-04-16 19:55:02 +0000 UTC" firstStartedPulling="2026-04-16 19:55:10.659093194 +0000 UTC m=+70.335095248" lastFinishedPulling="2026-04-16 19:55:23.881295322 +0000 UTC m=+83.557297380" observedRunningTime="2026-04-16 19:55:24.335059681 +0000 UTC m=+84.011061756" watchObservedRunningTime="2026-04-16 19:55:24.337536777 +0000 UTC m=+84.013538854"
Apr 16 19:55:26.085776 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:26.085748 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r"
Apr 16 19:55:26.086337 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:26.085789 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r"
Apr 16 19:55:31.739165 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:31.739134 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:31.739165 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:31.739187 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:31.743985 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:31.743962 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:32.329655 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:32.329614 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-df5f56696-pczr2"
Apr 16 19:55:32.382905 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:32.382874 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"]
Apr 16 19:55:37.039372 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:37.039340 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5xgdl_fded5762-e4ff-4f63-94bd-04c5209ebead/serve-healthcheck-canary/0.log"
Apr 16 19:55:46.090758 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:46.090731 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r"
Apr 16 19:55:46.094716 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:46.094693 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7878fffd95-cgn9r"
Apr 16 19:55:51.281405 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:51.281374 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dpc5h"
Apr 16 19:55:57.410112 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.410062 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d6ddbc884-hfkkp" podUID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" containerName="console" containerID="cri-o://73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695" gracePeriod=15
Apr 16 19:55:57.718210 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.718189 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d6ddbc884-hfkkp_fe03b8be-957c-4a17-9b1a-6605af5d7e6c/console/0.log"
Apr 16 19:55:57.718320 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.718256 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:57.798063 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798038 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798211 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798075 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798211 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798129 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ccb\" (UniqueName: \"kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798211 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798195 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798225 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798439 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798403 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:57.798439 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798410 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798541 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config\") pod \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\" (UID: \"fe03b8be-957c-4a17-9b1a-6605af5d7e6c\") "
Apr 16 19:55:57.798674 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798644 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:57.798674 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798664 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-service-ca\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.798791 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798770 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:57.798888 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.798864 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config" (OuterVolumeSpecName: "console-config") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:57.800464 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.800436 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb" (OuterVolumeSpecName: "kube-api-access-v8ccb") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "kube-api-access-v8ccb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:57.800464 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.800456 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:57.800591 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.800473 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fe03b8be-957c-4a17-9b1a-6605af5d7e6c" (UID: "fe03b8be-957c-4a17-9b1a-6605af5d7e6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:57.899732 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899707 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.899732 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899729 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8ccb\" (UniqueName: \"kubernetes.io/projected/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-kube-api-access-v8ccb\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.899870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899739 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-oauth-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.899870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899748 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-oauth-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.899870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899756 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-trusted-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:57.899870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:57.899765 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe03b8be-957c-4a17-9b1a-6605af5d7e6c-console-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 19:55:58.396346 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396321 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d6ddbc884-hfkkp_fe03b8be-957c-4a17-9b1a-6605af5d7e6c/console/0.log"
Apr 16 19:55:58.396505 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396357 2568 generic.go:358] "Generic (PLEG): container finished" podID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" containerID="73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695" exitCode=2
Apr 16 19:55:58.396505 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6ddbc884-hfkkp" event={"ID":"fe03b8be-957c-4a17-9b1a-6605af5d7e6c","Type":"ContainerDied","Data":"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"}
Apr 16 19:55:58.396505 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396418 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d6ddbc884-hfkkp"
Apr 16 19:55:58.396505 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396434 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6ddbc884-hfkkp" event={"ID":"fe03b8be-957c-4a17-9b1a-6605af5d7e6c","Type":"ContainerDied","Data":"e5a773d2401f636a9d80f38c34cd829011277178e165631da4abdc1d6543a2e8"}
Apr 16 19:55:58.396505 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.396453 2568 scope.go:117] "RemoveContainer" containerID="73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"
Apr 16 19:55:58.406420 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.406337 2568 scope.go:117] "RemoveContainer" containerID="73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"
Apr 16 19:55:58.406691 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:55:58.406662 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695\": container with ID starting with 73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695 not found: ID does not exist" containerID="73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"
Apr 16 19:55:58.406788 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.406702 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695"} err="failed to get container status \"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695\": rpc error: code = NotFound desc = could not find container \"73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695\": container with ID starting with 73122835c74b19250ec00c8d6359cf4a9e2dec0d0c937c28572b9c2d07c59695 not found: ID does not exist"
Apr 16 19:55:58.419163 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.419143 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"]
Apr 16 19:55:58.421466 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.421443 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d6ddbc884-hfkkp"]
Apr 16 19:55:58.896658 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:55:58.896627 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" path="/var/lib/kubelet/pods/fe03b8be-957c-4a17-9b1a-6605af5d7e6c/volumes"
Apr 16 19:56:21.677870 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.677836 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:56:21.678522 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678470 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="alertmanager" containerID="cri-o://4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" gracePeriod=120
Apr 16 19:56:21.678643 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678510 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-web" containerID="cri-o://5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" gracePeriod=120
Apr 16 19:56:21.678643 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678528 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="config-reloader" containerID="cri-o://ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" gracePeriod=120
Apr 16 19:56:21.678643 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678550 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="prom-label-proxy" containerID="cri-o://35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" gracePeriod=120
Apr 16 19:56:21.678643 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678566 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy" containerID="cri-o://52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" gracePeriod=120
Apr 16 19:56:21.678643 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:21.678608 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-metric" containerID="cri-o://70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" gracePeriod=120
Apr 16 19:56:22.462508 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462475 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" exitCode=0
Apr 16 19:56:22.462508 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462499 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" exitCode=0
Apr 16 19:56:22.462508 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462506 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" exitCode=0
Apr 16 19:56:22.462508 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462511 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" exitCode=0
Apr 16 19:56:22.462768 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375"}
Apr 16 19:56:22.462768 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e"}
Apr 16 19:56:22.462768 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462588 2568 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6"} Apr 16 19:56:22.462768 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.462597 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6"} Apr 16 19:56:22.916992 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.916972 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:22.973373 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973342 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973538 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973402 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973538 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973445 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973538 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:56:22.973471 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973538 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973497 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973538 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973527 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tz9s\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973559 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973589 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973621 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973664 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973691 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973739 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973770 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:22.973805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973791 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"b8413c40-c771-4971-b93c-eee3e92225b3\" (UID: \"b8413c40-c771-4971-b93c-eee3e92225b3\") " Apr 16 19:56:22.974233 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.973806 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:22.974233 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.974083 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:22.974233 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.974101 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-alertmanager-main-db\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:22.974736 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.974711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:22.976929 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.976902 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.977717 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977686 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.977874 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977839 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:22.977967 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977920 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.977967 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977933 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out" (OuterVolumeSpecName: "config-out") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:22.978073 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977950 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.978073 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.977999 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.978220 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.978197 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s" (OuterVolumeSpecName: "kube-api-access-9tz9s") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "kube-api-access-9tz9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:22.983664 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.983639 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:22.989415 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:22.989364 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config" (OuterVolumeSpecName: "web-config") pod "b8413c40-c771-4971-b93c-eee3e92225b3" (UID: "b8413c40-c771-4971-b93c-eee3e92225b3"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:23.074842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074819 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074842 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074842 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8413c40-c771-4971-b93c-eee3e92225b3-metrics-client-ca\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074853 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-web-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074864 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074874 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-main-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074883 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074892 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-config-volume\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074901 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tz9s\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-kube-api-access-9tz9s\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074910 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8413c40-c771-4971-b93c-eee3e92225b3-cluster-tls-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074918 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8413c40-c771-4971-b93c-eee3e92225b3-config-out\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.074969 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.074926 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8413c40-c771-4971-b93c-eee3e92225b3-tls-assets\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:56:23.467512 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467483 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" exitCode=0 Apr 16 
19:56:23.467512 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467506 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8413c40-c771-4971-b93c-eee3e92225b3" containerID="5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" exitCode=0 Apr 16 19:56:23.467701 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9"} Apr 16 19:56:23.467701 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467616 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af"} Apr 16 19:56:23.467701 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467628 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b8413c40-c771-4971-b93c-eee3e92225b3","Type":"ContainerDied","Data":"a4a33d953450e55a7d39d7e3bf8ea51347734ce44d5ec6c38a77b0b3e797a8bb"} Apr 16 19:56:23.467701 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467643 2568 scope.go:117] "RemoveContainer" containerID="35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" Apr 16 19:56:23.467701 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.467586 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.475522 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.475507 2568 scope.go:117] "RemoveContainer" containerID="70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" Apr 16 19:56:23.481738 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.481723 2568 scope.go:117] "RemoveContainer" containerID="52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" Apr 16 19:56:23.487695 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.487678 2568 scope.go:117] "RemoveContainer" containerID="5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" Apr 16 19:56:23.490743 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.490723 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:23.494214 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.494190 2568 scope.go:117] "RemoveContainer" containerID="ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" Apr 16 19:56:23.500030 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.500016 2568 scope.go:117] "RemoveContainer" containerID="4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" Apr 16 19:56:23.507677 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.507656 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:23.509419 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.509406 2568 scope.go:117] "RemoveContainer" containerID="133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8" Apr 16 19:56:23.515234 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515219 2568 scope.go:117] "RemoveContainer" containerID="35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" Apr 16 19:56:23.515455 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.515436 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375\": container with ID starting with 35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375 not found: ID does not exist" containerID="35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" Apr 16 19:56:23.515503 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515463 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375"} err="failed to get container status \"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375\": rpc error: code = NotFound desc = could not find container \"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375\": container with ID starting with 35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375 not found: ID does not exist" Apr 16 19:56:23.515503 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515479 2568 scope.go:117] "RemoveContainer" containerID="70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" Apr 16 19:56:23.515672 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.515656 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9\": container with ID starting with 70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9 not found: ID does not exist" containerID="70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" Apr 16 19:56:23.515712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515676 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9"} err="failed to get container status \"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9\": rpc error: code = NotFound desc 
= could not find container \"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9\": container with ID starting with 70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9 not found: ID does not exist" Apr 16 19:56:23.515712 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515689 2568 scope.go:117] "RemoveContainer" containerID="52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" Apr 16 19:56:23.515924 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.515904 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e\": container with ID starting with 52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e not found: ID does not exist" containerID="52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" Apr 16 19:56:23.515962 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515931 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e"} err="failed to get container status \"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e\": rpc error: code = NotFound desc = could not find container \"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e\": container with ID starting with 52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e not found: ID does not exist" Apr 16 19:56:23.515962 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.515947 2568 scope.go:117] "RemoveContainer" containerID="5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" Apr 16 19:56:23.516163 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.516146 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af\": 
container with ID starting with 5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af not found: ID does not exist" containerID="5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" Apr 16 19:56:23.516273 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516255 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af"} err="failed to get container status \"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af\": rpc error: code = NotFound desc = could not find container \"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af\": container with ID starting with 5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af not found: ID does not exist" Apr 16 19:56:23.516321 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516275 2568 scope.go:117] "RemoveContainer" containerID="ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" Apr 16 19:56:23.516504 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.516490 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6\": container with ID starting with ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6 not found: ID does not exist" containerID="ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" Apr 16 19:56:23.516548 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516507 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6"} err="failed to get container status \"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6\": rpc error: code = NotFound desc = could not find container \"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6\": container with 
ID starting with ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6 not found: ID does not exist" Apr 16 19:56:23.516548 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516542 2568 scope.go:117] "RemoveContainer" containerID="4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" Apr 16 19:56:23.516745 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.516728 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6\": container with ID starting with 4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6 not found: ID does not exist" containerID="4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" Apr 16 19:56:23.516795 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516750 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6"} err="failed to get container status \"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6\": rpc error: code = NotFound desc = could not find container \"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6\": container with ID starting with 4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6 not found: ID does not exist" Apr 16 19:56:23.516795 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516763 2568 scope.go:117] "RemoveContainer" containerID="133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8" Apr 16 19:56:23.516945 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:56:23.516931 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8\": container with ID starting with 133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8 not found: ID does 
not exist" containerID="133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8" Apr 16 19:56:23.516980 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516949 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8"} err="failed to get container status \"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8\": rpc error: code = NotFound desc = could not find container \"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8\": container with ID starting with 133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8 not found: ID does not exist" Apr 16 19:56:23.516980 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.516962 2568 scope.go:117] "RemoveContainer" containerID="35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375" Apr 16 19:56:23.517128 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517111 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375"} err="failed to get container status \"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375\": rpc error: code = NotFound desc = could not find container \"35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375\": container with ID starting with 35a4255603246ed418f32fb2247406b532ac410f6592834d22059f252b05e375 not found: ID does not exist" Apr 16 19:56:23.517164 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517128 2568 scope.go:117] "RemoveContainer" containerID="70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9" Apr 16 19:56:23.517313 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517297 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9"} err="failed to get container status 
\"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9\": rpc error: code = NotFound desc = could not find container \"70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9\": container with ID starting with 70c5f55f88778ab0d41290bc47432033a97b989d1181386a40cd0437304285c9 not found: ID does not exist" Apr 16 19:56:23.517353 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517314 2568 scope.go:117] "RemoveContainer" containerID="52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e" Apr 16 19:56:23.517493 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517478 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e"} err="failed to get container status \"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e\": rpc error: code = NotFound desc = could not find container \"52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e\": container with ID starting with 52b2b412a4707f38fedbdaf9dc97d260afe02be251f0450624e32697e2092e8e not found: ID does not exist" Apr 16 19:56:23.517538 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517493 2568 scope.go:117] "RemoveContainer" containerID="5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af" Apr 16 19:56:23.517695 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517678 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af"} err="failed to get container status \"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af\": rpc error: code = NotFound desc = could not find container \"5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af\": container with ID starting with 5510225eaab3480ebb139b2ea48455f65ed1aa7d3b6bf79750a3e265fe8465af not found: ID does not exist" Apr 16 19:56:23.517740 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:56:23.517696 2568 scope.go:117] "RemoveContainer" containerID="ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6" Apr 16 19:56:23.517876 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517860 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6"} err="failed to get container status \"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6\": rpc error: code = NotFound desc = could not find container \"ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6\": container with ID starting with ba871c19b4a8275b043826fed420d5cef668d6ece552e61e284ab3734e969cb6 not found: ID does not exist" Apr 16 19:56:23.517918 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.517876 2568 scope.go:117] "RemoveContainer" containerID="4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6" Apr 16 19:56:23.518067 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.518048 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6"} err="failed to get container status \"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6\": rpc error: code = NotFound desc = could not find container \"4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6\": container with ID starting with 4ff096b903a3ee56f7eca6c59007018f7daf1d499466f9827a8f48f7a5bf3eb6 not found: ID does not exist" Apr 16 19:56:23.518109 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.518068 2568 scope.go:117] "RemoveContainer" containerID="133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8" Apr 16 19:56:23.518276 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.518258 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8"} err="failed to get container status \"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8\": rpc error: code = NotFound desc = could not find container \"133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8\": container with ID starting with 133e6ae65db8b990f23a8c4e50664f487163f14e70bdf86b97678d81312738b8 not found: ID does not exist" Apr 16 19:56:23.529089 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529072 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:23.529333 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529319 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="alertmanager" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529336 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="alertmanager" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529354 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-metric" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529361 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-metric" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529369 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="prom-label-proxy" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529375 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" 
containerName="prom-label-proxy" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529381 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="config-reloader" Apr 16 19:56:23.529389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529386 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="config-reloader" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529396 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-web" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529402 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-web" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529408 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529412 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529417 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="init-config-reloader" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529423 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="init-config-reloader" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529429 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" containerName="console" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529434 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" containerName="console" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529476 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe03b8be-957c-4a17-9b1a-6605af5d7e6c" containerName="console" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529483 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-web" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529490 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529496 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="prom-label-proxy" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529503 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="config-reloader" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529509 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="kube-rbac-proxy-metric" Apr 16 19:56:23.529624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.529516 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" containerName="alertmanager" Apr 16 19:56:23.534271 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.534254 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.536919 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.536899 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:56:23.537005 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.536917 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:56:23.537189 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537147 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:56:23.537189 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537152 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:56:23.537341 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537198 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:56:23.537341 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537255 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:56:23.537341 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537304 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-p6pfn\"" Apr 16 19:56:23.537588 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537570 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:56:23.537635 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.537595 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:56:23.542287 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.542267 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:56:23.545817 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.545799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:23.678382 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678560 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678391 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678560 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-web-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678560 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678560 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678525 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678560 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678824 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678591 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmvjt\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-kube-api-access-tmvjt\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678824 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678824 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:56:23.678651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-out\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678824 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.678824 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.679000 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.679000 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.678857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.779949 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.779923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.779957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780077 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.779979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780209 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780209 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:56:23.780189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-web-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780318 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780411 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780465 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780515 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmvjt\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-kube-api-access-tmvjt\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780515 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780615 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-out\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780615 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.780709 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.780611 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.781553 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.781527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.782587 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.782556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0adcc642-de3f-4474-9dc5-282398ec9c9e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783191 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.782991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783191 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.783082 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783191 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.783109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783380 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.783198 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783380 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.783273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-volume\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.783553 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.783534 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.784070 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.784051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0adcc642-de3f-4474-9dc5-282398ec9c9e-web-config\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.784738 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.784718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.784879 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.784862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0adcc642-de3f-4474-9dc5-282398ec9c9e-config-out\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.788621 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.788601 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmvjt\" (UniqueName: \"kubernetes.io/projected/0adcc642-de3f-4474-9dc5-282398ec9c9e-kube-api-access-tmvjt\") pod \"alertmanager-main-0\" (UID: \"0adcc642-de3f-4474-9dc5-282398ec9c9e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.843582 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.843558 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:56:23.949711 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.949677 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"] Apr 16 19:56:23.954304 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.954285 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:23.960388 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.960361 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"] Apr 16 19:56:23.970086 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:23.970066 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:56:23.973658 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:56:23.973635 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0adcc642_de3f_4474_9dc5_282398ec9c9e.slice/crio-7ff450e60ee64e350f2a7aa40828a9faa5718e51359922be587e2bf51977a0f9 WatchSource:0}: Error finding container 7ff450e60ee64e350f2a7aa40828a9faa5718e51359922be587e2bf51977a0f9: Status 404 returned error can't find the container with id 7ff450e60ee64e350f2a7aa40828a9faa5718e51359922be587e2bf51977a0f9 Apr 16 19:56:24.082944 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.082916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98mv\" (UniqueName: \"kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083042 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.082952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083042 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.082978 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083112 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.083044 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083112 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.083097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083194 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.083130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.083194 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.083146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " 
pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.183992 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.183959 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184101 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184101 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184101 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184075 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184275 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184098 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca\") pod 
\"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184275 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184135 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s98mv\" (UniqueName: \"kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184275 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184778 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184897 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184977 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.184977 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.184972 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.186424 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.186395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.186424 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.186416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.191556 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.191537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98mv\" (UniqueName: \"kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv\") pod \"console-5c68c599bb-zmf8r\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.264223 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.264203 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:24.375360 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.375335 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"] Apr 16 19:56:24.377058 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:56:24.377036 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dc19d41_37dc_4185_a862_b53f4d19cd10.slice/crio-dac30c4610a75e86478cbae560b8c9cd92e194b2d9f864b83272be6c9b63fa71 WatchSource:0}: Error finding container dac30c4610a75e86478cbae560b8c9cd92e194b2d9f864b83272be6c9b63fa71: Status 404 returned error can't find the container with id dac30c4610a75e86478cbae560b8c9cd92e194b2d9f864b83272be6c9b63fa71 Apr 16 19:56:24.476866 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.476829 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c68c599bb-zmf8r" event={"ID":"7dc19d41-37dc-4185-a862-b53f4d19cd10","Type":"ContainerStarted","Data":"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db"} Apr 16 19:56:24.477019 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.476870 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c68c599bb-zmf8r" event={"ID":"7dc19d41-37dc-4185-a862-b53f4d19cd10","Type":"ContainerStarted","Data":"dac30c4610a75e86478cbae560b8c9cd92e194b2d9f864b83272be6c9b63fa71"} Apr 16 19:56:24.478243 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.478217 2568 generic.go:358] "Generic (PLEG): container finished" podID="0adcc642-de3f-4474-9dc5-282398ec9c9e" containerID="e605cfad87d590beae1d7d02d028985b1a144107a197968050b88fcec5d1312d" exitCode=0 Apr 16 19:56:24.478343 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.478266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerDied","Data":"e605cfad87d590beae1d7d02d028985b1a144107a197968050b88fcec5d1312d"} Apr 16 19:56:24.478343 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.478283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"7ff450e60ee64e350f2a7aa40828a9faa5718e51359922be587e2bf51977a0f9"} Apr 16 19:56:24.493882 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.493843 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c68c599bb-zmf8r" podStartSLOduration=1.493830759 podStartE2EDuration="1.493830759s" podCreationTimestamp="2026-04-16 19:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:24.49273452 +0000 UTC m=+144.168736596" watchObservedRunningTime="2026-04-16 19:56:24.493830759 +0000 UTC m=+144.169832835" Apr 16 19:56:24.897128 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:24.897097 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8413c40-c771-4971-b93c-eee3e92225b3" path="/var/lib/kubelet/pods/b8413c40-c771-4971-b93c-eee3e92225b3/volumes" Apr 16 19:56:25.485982 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.485944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"d9d0df42b78684ac7a01fb45f062460ad8e010624cc0ff8e60d62b88dcb5bed2"} Apr 16 19:56:25.485982 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.485985 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"b2356586adf8b78c8c21fc1b7a8deae19ec52c958982517aa8197666ed534dd6"} Apr 16 19:56:25.486504 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.485996 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"d2ba79c68080e34234689f012b7ca486d5767bea358bdb97e3dde98a45cbe0f3"} Apr 16 19:56:25.486504 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.486004 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"682f4560ed136d715fc93822acef710c6269bb762054f39f54da2dc6210c4512"} Apr 16 19:56:25.486504 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.486012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"2862ee7e0a3dded8360318276467db9eb9b6289c346f799c498650785691007c"} Apr 16 19:56:25.486504 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.486021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0adcc642-de3f-4474-9dc5-282398ec9c9e","Type":"ContainerStarted","Data":"891db8d9c989d7c746637dd01dfb16d6f84f87a43f5d636db4a8979d8063753e"} Apr 16 19:56:25.513242 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:25.513202 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.513187908 podStartE2EDuration="2.513187908s" podCreationTimestamp="2026-04-16 19:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:25.511065149 +0000 UTC 
m=+145.187067225" watchObservedRunningTime="2026-04-16 19:56:25.513187908 +0000 UTC m=+145.189189978" Apr 16 19:56:34.264620 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:34.264588 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:34.265054 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:34.264682 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:34.269147 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:34.269128 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:34.515189 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:34.515089 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 19:56:34.557902 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:34.557868 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df5f56696-pczr2"] Apr 16 19:56:56.897717 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:56.897646 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2dpz7"] Apr 16 19:56:56.900759 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:56.900743 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:56.903474 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:56.903451 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:56:56.905180 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:56.905146 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2dpz7"] Apr 16 19:56:57.027929 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.027901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-kubelet-config\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.027929 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.027935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-dbus\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.028137 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.027970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-original-pull-secret\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.129240 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.129201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-kubelet-config\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.129240 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.129242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-dbus\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.129423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.129266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-original-pull-secret\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.129423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.129314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-kubelet-config\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.129423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.129363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-dbus\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.131470 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.131453 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a-original-pull-secret\") pod \"global-pull-secret-syncer-2dpz7\" (UID: \"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a\") " pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.210788 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.210722 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2dpz7" Apr 16 19:56:57.328259 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.328234 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2dpz7"] Apr 16 19:56:57.330624 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:56:57.330593 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d82b3f2_f3ee_4cdc_98fe_a6b57d616a0a.slice/crio-42652271d525e4643745467778fcb7d24bcad59c185eadb0db9853016fb20275 WatchSource:0}: Error finding container 42652271d525e4643745467778fcb7d24bcad59c185eadb0db9853016fb20275: Status 404 returned error can't find the container with id 42652271d525e4643745467778fcb7d24bcad59c185eadb0db9853016fb20275 Apr 16 19:56:57.577222 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:57.577189 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2dpz7" event={"ID":"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a","Type":"ContainerStarted","Data":"42652271d525e4643745467778fcb7d24bcad59c185eadb0db9853016fb20275"} Apr 16 19:56:59.577190 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:56:59.577117 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-df5f56696-pczr2" podUID="3edd642e-36e5-4b56-842b-132d571bb17b" containerName="console" containerID="cri-o://c7820b075150c3ed1f2d91f0cf26ec4ffdc94fdb115772026e2aef1048d090f7" gracePeriod=15 Apr 16 19:57:00.588326 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:57:00.588294 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df5f56696-pczr2_3edd642e-36e5-4b56-842b-132d571bb17b/console/0.log" Apr 16 19:57:00.588805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:00.588701 2568 generic.go:358] "Generic (PLEG): container finished" podID="3edd642e-36e5-4b56-842b-132d571bb17b" containerID="c7820b075150c3ed1f2d91f0cf26ec4ffdc94fdb115772026e2aef1048d090f7" exitCode=2 Apr 16 19:57:00.588805 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:00.588763 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5f56696-pczr2" event={"ID":"3edd642e-36e5-4b56-842b-132d571bb17b","Type":"ContainerDied","Data":"c7820b075150c3ed1f2d91f0cf26ec4ffdc94fdb115772026e2aef1048d090f7"} Apr 16 19:57:01.117636 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.117617 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df5f56696-pczr2_3edd642e-36e5-4b56-842b-132d571bb17b/console/0.log" Apr 16 19:57:01.117733 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.117676 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-df5f56696-pczr2" Apr 16 19:57:01.265073 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265046 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265208 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265208 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265142 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265208 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265192 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265329 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265231 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265405 ip-10-0-140-191 
kubenswrapper[2568]: I0416 19:57:01.265384 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwqb\" (UniqueName: \"kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265506 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265435 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config\") pod \"3edd642e-36e5-4b56-842b-132d571bb17b\" (UID: \"3edd642e-36e5-4b56-842b-132d571bb17b\") " Apr 16 19:57:01.265570 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265440 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config" (OuterVolumeSpecName: "console-config") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:01.265624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265555 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca" (OuterVolumeSpecName: "service-ca") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:01.265679 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265612 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:01.265679 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265630 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:57:01.265779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265751 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-console-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.265779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265768 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-service-ca\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.265779 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.265778 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-oauth-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.265891 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:57:01.265788 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3edd642e-36e5-4b56-842b-132d571bb17b-trusted-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.267463 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.267442 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:01.267668 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.267647 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:57:01.267668 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.267654 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb" (OuterVolumeSpecName: "kube-api-access-mvwqb") pod "3edd642e-36e5-4b56-842b-132d571bb17b" (UID: "3edd642e-36e5-4b56-842b-132d571bb17b"). InnerVolumeSpecName "kube-api-access-mvwqb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:01.367115 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.367034 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.367115 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.367064 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvwqb\" (UniqueName: \"kubernetes.io/projected/3edd642e-36e5-4b56-842b-132d571bb17b-kube-api-access-mvwqb\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.367115 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.367074 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3edd642e-36e5-4b56-842b-132d571bb17b-console-oauth-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:01.593079 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.593037 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2dpz7" event={"ID":"6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a","Type":"ContainerStarted","Data":"3240911c717e3389997c1078b179820cd046944aad0c8928f5375daaca9f3d5c"} Apr 16 19:57:01.594273 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.594255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-df5f56696-pczr2_3edd642e-36e5-4b56-842b-132d571bb17b/console/0.log" Apr 16 19:57:01.594381 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.594300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-df5f56696-pczr2" event={"ID":"3edd642e-36e5-4b56-842b-132d571bb17b","Type":"ContainerDied","Data":"bc5aecb7c9cb4cb274ef5b6b64507794b627887ffb08690e427b14f25bbb7c65"} Apr 16 19:57:01.594381 ip-10-0-140-191 kubenswrapper[2568]: 
I0416 19:57:01.594336 2568 scope.go:117] "RemoveContainer" containerID="c7820b075150c3ed1f2d91f0cf26ec4ffdc94fdb115772026e2aef1048d090f7" Apr 16 19:57:01.594381 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.594348 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-df5f56696-pczr2" Apr 16 19:57:01.608415 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.608371 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2dpz7" podStartSLOduration=1.780544599 podStartE2EDuration="5.608349817s" podCreationTimestamp="2026-04-16 19:56:56 +0000 UTC" firstStartedPulling="2026-04-16 19:56:57.332119212 +0000 UTC m=+177.008121271" lastFinishedPulling="2026-04-16 19:57:01.159924432 +0000 UTC m=+180.835926489" observedRunningTime="2026-04-16 19:57:01.607765438 +0000 UTC m=+181.283767515" watchObservedRunningTime="2026-04-16 19:57:01.608349817 +0000 UTC m=+181.284351896" Apr 16 19:57:01.621889 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.621836 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-df5f56696-pczr2"] Apr 16 19:57:01.624753 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:01.624730 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-df5f56696-pczr2"] Apr 16 19:57:02.896704 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:02.896674 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edd642e-36e5-4b56-842b-132d571bb17b" path="/var/lib/kubelet/pods/3edd642e-36e5-4b56-842b-132d571bb17b/volumes" Apr 16 19:57:18.395037 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.395005 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb"] Apr 16 19:57:18.395425 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.395338 2568 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="3edd642e-36e5-4b56-842b-132d571bb17b" containerName="console" Apr 16 19:57:18.395425 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.395353 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edd642e-36e5-4b56-842b-132d571bb17b" containerName="console" Apr 16 19:57:18.395425 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.395424 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3edd642e-36e5-4b56-842b-132d571bb17b" containerName="console" Apr 16 19:57:18.398312 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.398293 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.400938 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.400914 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:57:18.402241 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.402222 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-2kmg5\"" Apr 16 19:57:18.402335 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.402222 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:57:18.406746 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.406720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb"] Apr 16 19:57:18.487301 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.487276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: 
\"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.487416 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.487308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.487416 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.487342 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdfng\" (UniqueName: \"kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.588108 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.588083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdfng\" (UniqueName: \"kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.588215 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.588133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: 
\"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.588215 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.588157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.588475 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.588459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.588521 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.588480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.595961 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.595941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdfng\" (UniqueName: \"kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.707959 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.707901 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:18.846817 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:18.846780 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb"] Apr 16 19:57:18.849475 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:57:18.849451 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb01552df_8106_4d91_b438_fb0cf7d8fbb0.slice/crio-8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932 WatchSource:0}: Error finding container 8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932: Status 404 returned error can't find the container with id 8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932 Apr 16 19:57:19.648825 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:19.648779 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerStarted","Data":"8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932"} Apr 16 19:57:26.672989 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:26.672953 2568 generic.go:358] "Generic (PLEG): container finished" podID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerID="88fc2b363800c85f63ea6b084c4b807f2cd5116440009cfbeace41018634072b" exitCode=0 Apr 16 19:57:26.673394 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:26.673025 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerDied","Data":"88fc2b363800c85f63ea6b084c4b807f2cd5116440009cfbeace41018634072b"} Apr 16 19:57:28.770808 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.770782 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6"] Apr 16 19:57:28.775190 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.775146 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.779479 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.779439 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:57:28.779848 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.779453 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hx8f5\"" Apr 16 19:57:28.779974 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.779512 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 19:57:28.779974 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.779577 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:57:28.780088 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.779603 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:57:28.781889 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.781871 
2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6"] Apr 16 19:57:28.856685 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.856654 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs"] Apr 16 19:57:28.859878 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.859861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:28.862477 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.862458 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 19:57:28.870767 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.870746 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs"] Apr 16 19:57:28.875927 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.875900 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5fl\" (UniqueName: \"kubernetes.io/projected/1220f482-33d6-4436-8e2c-d731a8ed2802-kube-api-access-tv5fl\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.876009 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.875976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1220f482-33d6-4436-8e2c-d731a8ed2802-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.976423 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.976397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9aa685b-3209-4938-9d7e-482c68062a8b-tmp\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:28.976528 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.976430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nqn\" (UniqueName: \"kubernetes.io/projected/b9aa685b-3209-4938-9d7e-482c68062a8b-kube-api-access-v8nqn\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:28.976528 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.976456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1220f482-33d6-4436-8e2c-d731a8ed2802-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.976602 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.976578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5fl\" (UniqueName: \"kubernetes.io/projected/1220f482-33d6-4436-8e2c-d731a8ed2802-kube-api-access-tv5fl\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.976639 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.976613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b9aa685b-3209-4938-9d7e-482c68062a8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:28.978666 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.978646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1220f482-33d6-4436-8e2c-d731a8ed2802-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:28.984645 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:28.984621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5fl\" (UniqueName: \"kubernetes.io/projected/1220f482-33d6-4436-8e2c-d731a8ed2802-kube-api-access-tv5fl\") pod \"managed-serviceaccount-addon-agent-99955974c-68pm6\" (UID: \"1220f482-33d6-4436-8e2c-d731a8ed2802\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:29.077401 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.077341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b9aa685b-3209-4938-9d7e-482c68062a8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 
19:57:29.077401 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.077381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9aa685b-3209-4938-9d7e-482c68062a8b-tmp\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.077401 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.077397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nqn\" (UniqueName: \"kubernetes.io/projected/b9aa685b-3209-4938-9d7e-482c68062a8b-kube-api-access-v8nqn\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.077736 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.077715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9aa685b-3209-4938-9d7e-482c68062a8b-tmp\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.079702 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.079681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b9aa685b-3209-4938-9d7e-482c68062a8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.086630 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.086611 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nqn\" (UniqueName: 
\"kubernetes.io/projected/b9aa685b-3209-4938-9d7e-482c68062a8b-kube-api-access-v8nqn\") pod \"klusterlet-addon-workmgr-7c9885d575-2bpzs\" (UID: \"b9aa685b-3209-4938-9d7e-482c68062a8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.093420 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.093406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" Apr 16 19:57:29.169575 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.169498 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:29.209672 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.209647 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6"] Apr 16 19:57:29.212117 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:57:29.212081 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1220f482_33d6_4436_8e2c_d731a8ed2802.slice/crio-a4909c18cae7bcd800e07a7ba83de4d7a06c4fdbe744d0dd2912612fb71382b9 WatchSource:0}: Error finding container a4909c18cae7bcd800e07a7ba83de4d7a06c4fdbe744d0dd2912612fb71382b9: Status 404 returned error can't find the container with id a4909c18cae7bcd800e07a7ba83de4d7a06c4fdbe744d0dd2912612fb71382b9 Apr 16 19:57:29.286132 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.286082 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs"] Apr 16 19:57:29.288339 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:57:29.288304 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9aa685b_3209_4938_9d7e_482c68062a8b.slice/crio-8d99732495bc914eb0adc887851d427a63606f57c213fa163afb2207217fec62 WatchSource:0}: Error finding container 8d99732495bc914eb0adc887851d427a63606f57c213fa163afb2207217fec62: Status 404 returned error can't find the container with id 8d99732495bc914eb0adc887851d427a63606f57c213fa163afb2207217fec62 Apr 16 19:57:29.684030 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.684001 2568 generic.go:358] "Generic (PLEG): container finished" podID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerID="0fe0cdcebdd640bf7ee5e0edab2f7592749bdead8e9805190b7c85c705b36006" exitCode=0 Apr 16 19:57:29.684209 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.684085 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerDied","Data":"0fe0cdcebdd640bf7ee5e0edab2f7592749bdead8e9805190b7c85c705b36006"} Apr 16 19:57:29.685231 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.685203 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" event={"ID":"b9aa685b-3209-4938-9d7e-482c68062a8b","Type":"ContainerStarted","Data":"8d99732495bc914eb0adc887851d427a63606f57c213fa163afb2207217fec62"} Apr 16 19:57:29.686260 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:29.686236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" event={"ID":"1220f482-33d6-4436-8e2c-d731a8ed2802","Type":"ContainerStarted","Data":"a4909c18cae7bcd800e07a7ba83de4d7a06c4fdbe744d0dd2912612fb71382b9"} Apr 16 19:57:32.698556 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:32.698466 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" event={"ID":"1220f482-33d6-4436-8e2c-d731a8ed2802","Type":"ContainerStarted","Data":"fbaaf3c8097f324da46b02fcab2f76a60ed8605ab15790aea9408df67fb0d73e"} Apr 16 19:57:32.714166 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:32.714116 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-99955974c-68pm6" podStartSLOduration=1.512734033 podStartE2EDuration="4.714099356s" podCreationTimestamp="2026-04-16 19:57:28 +0000 UTC" firstStartedPulling="2026-04-16 19:57:29.214080149 +0000 UTC m=+208.890082202" lastFinishedPulling="2026-04-16 19:57:32.415445467 +0000 UTC m=+212.091447525" observedRunningTime="2026-04-16 19:57:32.713360607 +0000 UTC m=+212.389362683" watchObservedRunningTime="2026-04-16 19:57:32.714099356 +0000 UTC m=+212.390101435" Apr 16 19:57:34.704823 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:34.704790 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" event={"ID":"b9aa685b-3209-4938-9d7e-482c68062a8b","Type":"ContainerStarted","Data":"92c301d460d35ec3d0ccd33f4f742191432b9b074ff06b82fb306800169e9934"} Apr 16 19:57:35.707819 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:35.707784 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:35.709634 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:35.709612 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" Apr 16 19:57:35.726000 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:35.725943 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9885d575-2bpzs" podStartSLOduration=2.441107716 podStartE2EDuration="7.725925383s" podCreationTimestamp="2026-04-16 19:57:28 +0000 UTC" firstStartedPulling="2026-04-16 19:57:29.290040394 +0000 UTC m=+208.966042448" lastFinishedPulling="2026-04-16 19:57:34.574858058 +0000 UTC m=+214.250860115" observedRunningTime="2026-04-16 19:57:35.724765732 +0000 UTC m=+215.400767835" watchObservedRunningTime="2026-04-16 19:57:35.725925383 +0000 UTC m=+215.401927459" Apr 16 19:57:37.714478 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:37.714447 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerStarted","Data":"22ad2a8fd5b684c8cefa531cd70e3337a6c719b896c786d16e23c026d9af2d41"} Apr 16 19:57:37.731768 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:37.731718 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" podStartSLOduration=0.957364893 podStartE2EDuration="19.731698177s" podCreationTimestamp="2026-04-16 19:57:18 +0000 UTC" firstStartedPulling="2026-04-16 19:57:18.851383237 +0000 UTC m=+198.527385291" lastFinishedPulling="2026-04-16 19:57:37.62571652 +0000 UTC m=+217.301718575" observedRunningTime="2026-04-16 19:57:37.730782694 +0000 UTC m=+217.406784771" watchObservedRunningTime="2026-04-16 19:57:37.731698177 +0000 UTC m=+217.407700250" Apr 16 19:57:38.719287 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:38.719250 2568 generic.go:358] "Generic (PLEG): container finished" podID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerID="22ad2a8fd5b684c8cefa531cd70e3337a6c719b896c786d16e23c026d9af2d41" exitCode=0 Apr 16 19:57:38.719624 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:38.719334 2568 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerDied","Data":"22ad2a8fd5b684c8cefa531cd70e3337a6c719b896c786d16e23c026d9af2d41"} Apr 16 19:57:39.836389 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.836367 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:39.971392 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.971358 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdfng\" (UniqueName: \"kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng\") pod \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " Apr 16 19:57:39.971566 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.971442 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle\") pod \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " Apr 16 19:57:39.971566 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.971477 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util\") pod \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\" (UID: \"b01552df-8106-4d91-b438-fb0cf7d8fbb0\") " Apr 16 19:57:39.972003 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.971969 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle" (OuterVolumeSpecName: "bundle") pod "b01552df-8106-4d91-b438-fb0cf7d8fbb0" (UID: "b01552df-8106-4d91-b438-fb0cf7d8fbb0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:57:39.973617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.973561 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng" (OuterVolumeSpecName: "kube-api-access-sdfng") pod "b01552df-8106-4d91-b438-fb0cf7d8fbb0" (UID: "b01552df-8106-4d91-b438-fb0cf7d8fbb0"). InnerVolumeSpecName "kube-api-access-sdfng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:57:39.977040 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:39.977012 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util" (OuterVolumeSpecName: "util") pod "b01552df-8106-4d91-b438-fb0cf7d8fbb0" (UID: "b01552df-8106-4d91-b438-fb0cf7d8fbb0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:57:40.072807 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.072774 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdfng\" (UniqueName: \"kubernetes.io/projected/b01552df-8106-4d91-b438-fb0cf7d8fbb0-kube-api-access-sdfng\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:40.072807 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.072803 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:40.072977 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.072818 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b01552df-8106-4d91-b438-fb0cf7d8fbb0-util\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 19:57:40.728240 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.728214 2568 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" Apr 16 19:57:40.728377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.728209 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4z4jb" event={"ID":"b01552df-8106-4d91-b438-fb0cf7d8fbb0","Type":"ContainerDied","Data":"8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932"} Apr 16 19:57:40.728377 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:40.728326 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7ad90808fe676aa383a52b9ac26015d73c69251f279d7195dc5930a8782932" Apr 16 19:57:50.606883 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.606851 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-nsrrm"] Apr 16 19:57:50.607366 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607304 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="pull" Apr 16 19:57:50.607366 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607322 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="pull" Apr 16 19:57:50.607366 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607345 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="extract" Apr 16 19:57:50.607366 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607352 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="extract" Apr 16 19:57:50.607366 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607364 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="util" Apr 16 19:57:50.607605 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607373 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="util" Apr 16 19:57:50.607605 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.607447 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b01552df-8106-4d91-b438-fb0cf7d8fbb0" containerName="extract" Apr 16 19:57:50.610339 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.610318 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.613223 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.613200 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fjh4h\"" Apr 16 19:57:50.613349 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.613224 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 19:57:50.613349 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.613250 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 19:57:50.614396 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.614376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 19:57:50.614495 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.614376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 19:57:50.617702 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.617458 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nsrrm"] Apr 16 19:57:50.752502 ip-10-0-140-191 kubenswrapper[2568]: I0416 
19:57:50.752473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjxl8\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-kube-api-access-xjxl8\") pod \"keda-admission-cf49989db-nsrrm\" (UID: \"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.752656 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.752522 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-certificates\") pod \"keda-admission-cf49989db-nsrrm\" (UID: \"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.853983 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.853947 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjxl8\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-kube-api-access-xjxl8\") pod \"keda-admission-cf49989db-nsrrm\" (UID: \"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.854213 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.854011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-certificates\") pod \"keda-admission-cf49989db-nsrrm\" (UID: \"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.857205 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.857116 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-certificates\") pod \"keda-admission-cf49989db-nsrrm\" (UID: 
\"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.864250 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.864229 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjxl8\" (UniqueName: \"kubernetes.io/projected/4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea-kube-api-access-xjxl8\") pod \"keda-admission-cf49989db-nsrrm\" (UID: \"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea\") " pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:50.921658 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:50.921559 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:51.065793 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:51.065766 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-nsrrm"] Apr 16 19:57:51.068004 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:57:51.067969 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f38ed3b_bfe9_49dd_bb6d_b6b1be9563ea.slice/crio-469dc3f8ed96137475b5e998fcb6a114fbd7f4f026cefe2f1cf86e563cf500f5 WatchSource:0}: Error finding container 469dc3f8ed96137475b5e998fcb6a114fbd7f4f026cefe2f1cf86e563cf500f5: Status 404 returned error can't find the container with id 469dc3f8ed96137475b5e998fcb6a114fbd7f4f026cefe2f1cf86e563cf500f5 Apr 16 19:57:51.760437 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:51.760404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-nsrrm" event={"ID":"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea","Type":"ContainerStarted","Data":"469dc3f8ed96137475b5e998fcb6a114fbd7f4f026cefe2f1cf86e563cf500f5"} Apr 16 19:57:52.764339 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:52.764305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/keda-admission-cf49989db-nsrrm" event={"ID":"4f38ed3b-bfe9-49dd-bb6d-b6b1be9563ea","Type":"ContainerStarted","Data":"4b1badc3f2de322845af6bd22c56711aa250cf3d6d097e8e3b0d81406c638d65"} Apr 16 19:57:52.764810 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:52.764410 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:57:52.781699 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:57:52.781656 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-nsrrm" podStartSLOduration=1.512774323 podStartE2EDuration="2.781643523s" podCreationTimestamp="2026-04-16 19:57:50 +0000 UTC" firstStartedPulling="2026-04-16 19:57:51.069383871 +0000 UTC m=+230.745385927" lastFinishedPulling="2026-04-16 19:57:52.33825307 +0000 UTC m=+232.014255127" observedRunningTime="2026-04-16 19:57:52.780358113 +0000 UTC m=+232.456360189" watchObservedRunningTime="2026-04-16 19:57:52.781643523 +0000 UTC m=+232.457645600" Apr 16 19:58:13.769617 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:13.769542 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-nsrrm" Apr 16 19:58:58.890016 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.889981 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f"] Apr 16 19:58:58.893373 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.893349 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:58.896214 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.896191 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 19:58:58.896335 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.896285 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-rz5vd\"" Apr 16 19:58:58.897435 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.897416 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 19:58:58.897542 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.897415 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 19:58:58.905904 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.905884 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f"] Apr 16 19:58:58.952255 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.952220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:58.952402 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:58.952261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64g2d\" (UniqueName: \"kubernetes.io/projected/e9c7327f-6025-406b-a2f1-d593ef74741e-kube-api-access-64g2d\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" 
Apr 16 19:58:59.053610 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.053579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.053736 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.053614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64g2d\" (UniqueName: \"kubernetes.io/projected/e9c7327f-6025-406b-a2f1-d593ef74741e-kube-api-access-64g2d\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.053736 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:58:59.053692 2568 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 19:58:59.053810 ip-10-0-140-191 kubenswrapper[2568]: E0416 19:58:59.053758 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert podName:e9c7327f-6025-406b-a2f1-d593ef74741e nodeName:}" failed. No retries permitted until 2026-04-16 19:58:59.553742664 +0000 UTC m=+299.229744717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert") pod "llmisvc-controller-manager-68cc5db7c4-d8j7f" (UID: "e9c7327f-6025-406b-a2f1-d593ef74741e") : secret "llmisvc-webhook-server-cert" not found Apr 16 19:58:59.066500 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.066470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64g2d\" (UniqueName: \"kubernetes.io/projected/e9c7327f-6025-406b-a2f1-d593ef74741e-kube-api-access-64g2d\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.558629 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.558597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.560894 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.560877 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9c7327f-6025-406b-a2f1-d593ef74741e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-d8j7f\" (UID: \"e9c7327f-6025-406b-a2f1-d593ef74741e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.803465 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.803435 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:58:59.922999 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.922972 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f"] Apr 16 19:58:59.923844 ip-10-0-140-191 kubenswrapper[2568]: W0416 19:58:59.923814 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode9c7327f_6025_406b_a2f1_d593ef74741e.slice/crio-c98c75b343dbdb86a257934044cbd47624cdf2ad516d7dc0799c15c27900a77e WatchSource:0}: Error finding container c98c75b343dbdb86a257934044cbd47624cdf2ad516d7dc0799c15c27900a77e: Status 404 returned error can't find the container with id c98c75b343dbdb86a257934044cbd47624cdf2ad516d7dc0799c15c27900a77e Apr 16 19:58:59.951380 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:58:59.951352 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" event={"ID":"e9c7327f-6025-406b-a2f1-d593ef74741e","Type":"ContainerStarted","Data":"c98c75b343dbdb86a257934044cbd47624cdf2ad516d7dc0799c15c27900a77e"} Apr 16 19:59:01.037552 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:59:01.037521 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:59:01.959591 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:59:01.959515 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" event={"ID":"e9c7327f-6025-406b-a2f1-d593ef74741e","Type":"ContainerStarted","Data":"a51453bc19ffa6e52f57ac008e8f673031a49b55bec5d16d3ab6642f932b22e2"} Apr 16 19:59:01.959717 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:59:01.959654 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 19:59:01.979409 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:59:01.979354 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" podStartSLOduration=2.212424853 podStartE2EDuration="3.979337491s" podCreationTimestamp="2026-04-16 19:58:58 +0000 UTC" firstStartedPulling="2026-04-16 19:58:59.925060124 +0000 UTC m=+299.601062177" lastFinishedPulling="2026-04-16 19:59:01.691972761 +0000 UTC m=+301.367974815" observedRunningTime="2026-04-16 19:59:01.978466434 +0000 UTC m=+301.654468510" watchObservedRunningTime="2026-04-16 19:59:01.979337491 +0000 UTC m=+301.655339560" Apr 16 19:59:32.965273 ip-10-0-140-191 kubenswrapper[2568]: I0416 19:59:32.965245 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-d8j7f" Apr 16 20:00:07.989198 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:07.989139 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-ttt7q"] Apr 16 20:00:07.992474 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:07.992451 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:07.995358 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:07.995333 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:00:07.995358 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:07.995339 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-r6k9k\"" Apr 16 20:00:08.002096 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.002064 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ttt7q"] Apr 16 20:00:08.075085 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.075057 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f090f732-3d0f-4172-812d-a4b0a0370733-cert\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.075254 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.075127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-722x4\" (UniqueName: \"kubernetes.io/projected/f090f732-3d0f-4172-812d-a4b0a0370733-kube-api-access-722x4\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.175999 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.175953 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-722x4\" (UniqueName: \"kubernetes.io/projected/f090f732-3d0f-4172-812d-a4b0a0370733-kube-api-access-722x4\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" 
Apr 16 20:00:08.176199 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.176029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f090f732-3d0f-4172-812d-a4b0a0370733-cert\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.178348 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.178328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f090f732-3d0f-4172-812d-a4b0a0370733-cert\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.185506 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.185486 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-722x4\" (UniqueName: \"kubernetes.io/projected/f090f732-3d0f-4172-812d-a4b0a0370733-kube-api-access-722x4\") pod \"odh-model-controller-696fc77849-ttt7q\" (UID: \"f090f732-3d0f-4172-812d-a4b0a0370733\") " pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.304112 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.304085 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:08.424154 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.424128 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ttt7q"] Apr 16 20:00:08.426488 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:00:08.426462 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf090f732_3d0f_4172_812d_a4b0a0370733.slice/crio-3a71be237644692483f221378e4607931d41b16ad82f6053046bf8bd5a24b191 WatchSource:0}: Error finding container 3a71be237644692483f221378e4607931d41b16ad82f6053046bf8bd5a24b191: Status 404 returned error can't find the container with id 3a71be237644692483f221378e4607931d41b16ad82f6053046bf8bd5a24b191 Apr 16 20:00:08.427676 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:08.427659 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:00:09.142111 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:09.142071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ttt7q" event={"ID":"f090f732-3d0f-4172-812d-a4b0a0370733","Type":"ContainerStarted","Data":"3a71be237644692483f221378e4607931d41b16ad82f6053046bf8bd5a24b191"} Apr 16 20:00:12.153022 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:12.152924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ttt7q" event={"ID":"f090f732-3d0f-4172-812d-a4b0a0370733","Type":"ContainerStarted","Data":"8dc154f5f95269c0a144db02cf5fbab37a6505e49d38c7f01fae19a9dd899140"} Apr 16 20:00:12.153022 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:12.152998 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:12.171070 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:12.171020 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-ttt7q" podStartSLOduration=1.827417991 podStartE2EDuration="5.171007743s" podCreationTimestamp="2026-04-16 20:00:07 +0000 UTC" firstStartedPulling="2026-04-16 20:00:08.427787338 +0000 UTC m=+368.103789395" lastFinishedPulling="2026-04-16 20:00:11.771377093 +0000 UTC m=+371.447379147" observedRunningTime="2026-04-16 20:00:12.16979711 +0000 UTC m=+371.845799215" watchObservedRunningTime="2026-04-16 20:00:12.171007743 +0000 UTC m=+371.847009818" Apr 16 20:00:22.698430 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.698398 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69f97474bd-lch2q"] Apr 16 20:00:22.701654 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.701634 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.712315 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.712291 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f97474bd-lch2q"] Apr 16 20:00:22.788261 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788237 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788398 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zm5b\" (UniqueName: \"kubernetes.io/projected/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-kube-api-access-6zm5b\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " 
pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788398 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788298 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-oauth-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788398 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-oauth-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788737 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788432 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-trusted-ca-bundle\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788737 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788460 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.788737 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.788492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-service-ca\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889076 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-trusted-ca-bundle\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-service-ca\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889378 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6zm5b\" (UniqueName: \"kubernetes.io/projected/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-kube-api-access-6zm5b\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889378 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-oauth-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889378 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889366 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-oauth-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.889948 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889927 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-trusted-ca-bundle\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.890051 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889963 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-service-ca\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.890051 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.889977 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.890051 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.890033 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-oauth-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.891596 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.891577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-oauth-config\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.891596 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.891589 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-console-serving-cert\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:22.897407 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:22.897385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zm5b\" (UniqueName: \"kubernetes.io/projected/00287a79-a46a-4dd3-bfcd-31bd74b8cf70-kube-api-access-6zm5b\") pod \"console-69f97474bd-lch2q\" (UID: \"00287a79-a46a-4dd3-bfcd-31bd74b8cf70\") " pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:23.010588 ip-10-0-140-191 
kubenswrapper[2568]: I0416 20:00:23.010561 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:23.131538 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.131514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f97474bd-lch2q"] Apr 16 20:00:23.133584 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:00:23.133558 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00287a79_a46a_4dd3_bfcd_31bd74b8cf70.slice/crio-1bc197d560f2c618361e0665f322d996b3f1e21b7c72cb89bdbd08a611a0c592 WatchSource:0}: Error finding container 1bc197d560f2c618361e0665f322d996b3f1e21b7c72cb89bdbd08a611a0c592: Status 404 returned error can't find the container with id 1bc197d560f2c618361e0665f322d996b3f1e21b7c72cb89bdbd08a611a0c592 Apr 16 20:00:23.158110 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.158089 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-ttt7q" Apr 16 20:00:23.186032 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.186002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f97474bd-lch2q" event={"ID":"00287a79-a46a-4dd3-bfcd-31bd74b8cf70","Type":"ContainerStarted","Data":"1bc197d560f2c618361e0665f322d996b3f1e21b7c72cb89bdbd08a611a0c592"} Apr 16 20:00:23.985302 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.985271 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-96bbb"] Apr 16 20:00:23.988462 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.988445 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-96bbb" Apr 16 20:00:23.990882 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.990860 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:00:23.991010 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.990993 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xf4q2\"" Apr 16 20:00:23.994412 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:23.994394 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-96bbb"] Apr 16 20:00:24.100401 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.100374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mffnv\" (UniqueName: \"kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv\") pod \"s3-init-96bbb\" (UID: \"9bf30547-62ef-4eea-b806-6a6eabe22ef3\") " pod="kserve/s3-init-96bbb" Apr 16 20:00:24.190302 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.190272 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f97474bd-lch2q" event={"ID":"00287a79-a46a-4dd3-bfcd-31bd74b8cf70","Type":"ContainerStarted","Data":"21288d05336b2a040e17e5146dc0f4f19bc4a680c451263c50fe93ecd158c7be"} Apr 16 20:00:24.200888 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.200864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mffnv\" (UniqueName: \"kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv\") pod \"s3-init-96bbb\" (UID: \"9bf30547-62ef-4eea-b806-6a6eabe22ef3\") " pod="kserve/s3-init-96bbb" Apr 16 20:00:24.211140 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.211101 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69f97474bd-lch2q" podStartSLOduration=2.211088004 
podStartE2EDuration="2.211088004s" podCreationTimestamp="2026-04-16 20:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:00:24.209598219 +0000 UTC m=+383.885600299" watchObservedRunningTime="2026-04-16 20:00:24.211088004 +0000 UTC m=+383.887090080" Apr 16 20:00:24.214519 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.214500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mffnv\" (UniqueName: \"kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv\") pod \"s3-init-96bbb\" (UID: \"9bf30547-62ef-4eea-b806-6a6eabe22ef3\") " pod="kserve/s3-init-96bbb" Apr 16 20:00:24.298272 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.298241 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96bbb" Apr 16 20:00:24.414692 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:24.414666 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-96bbb"] Apr 16 20:00:24.416086 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:00:24.416054 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bf30547_62ef_4eea_b806_6a6eabe22ef3.slice/crio-6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49 WatchSource:0}: Error finding container 6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49: Status 404 returned error can't find the container with id 6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49 Apr 16 20:00:25.194388 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:25.194355 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96bbb" event={"ID":"9bf30547-62ef-4eea-b806-6a6eabe22ef3","Type":"ContainerStarted","Data":"6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49"} Apr 16 20:00:30.212686 
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:30.212650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96bbb" event={"ID":"9bf30547-62ef-4eea-b806-6a6eabe22ef3","Type":"ContainerStarted","Data":"b16600782d97cf4ba386f29ea9f254166e8c2c50ef58d869da9d7745e878f8b5"} Apr 16 20:00:30.229012 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:30.228973 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-96bbb" podStartSLOduration=2.214002898 podStartE2EDuration="7.228959078s" podCreationTimestamp="2026-04-16 20:00:23 +0000 UTC" firstStartedPulling="2026-04-16 20:00:24.418109401 +0000 UTC m=+384.094111455" lastFinishedPulling="2026-04-16 20:00:29.433065578 +0000 UTC m=+389.109067635" observedRunningTime="2026-04-16 20:00:30.227859751 +0000 UTC m=+389.903862032" watchObservedRunningTime="2026-04-16 20:00:30.228959078 +0000 UTC m=+389.904961154" Apr 16 20:00:33.010934 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.010901 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:33.011330 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.010945 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:33.015329 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.015305 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:33.223360 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.223327 2568 generic.go:358] "Generic (PLEG): container finished" podID="9bf30547-62ef-4eea-b806-6a6eabe22ef3" containerID="b16600782d97cf4ba386f29ea9f254166e8c2c50ef58d869da9d7745e878f8b5" exitCode=0 Apr 16 20:00:33.223520 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.223408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96bbb" 
event={"ID":"9bf30547-62ef-4eea-b806-6a6eabe22ef3","Type":"ContainerDied","Data":"b16600782d97cf4ba386f29ea9f254166e8c2c50ef58d869da9d7745e878f8b5"} Apr 16 20:00:33.227246 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.227222 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69f97474bd-lch2q" Apr 16 20:00:33.315646 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:33.315572 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"] Apr 16 20:00:34.347235 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:34.347213 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96bbb" Apr 16 20:00:34.494716 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:34.494649 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mffnv\" (UniqueName: \"kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv\") pod \"9bf30547-62ef-4eea-b806-6a6eabe22ef3\" (UID: \"9bf30547-62ef-4eea-b806-6a6eabe22ef3\") " Apr 16 20:00:34.496737 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:34.496709 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv" (OuterVolumeSpecName: "kube-api-access-mffnv") pod "9bf30547-62ef-4eea-b806-6a6eabe22ef3" (UID: "9bf30547-62ef-4eea-b806-6a6eabe22ef3"). InnerVolumeSpecName "kube-api-access-mffnv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:00:34.595630 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:34.595605 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mffnv\" (UniqueName: \"kubernetes.io/projected/9bf30547-62ef-4eea-b806-6a6eabe22ef3-kube-api-access-mffnv\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:35.230146 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:35.230080 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-96bbb" Apr 16 20:00:35.230146 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:35.230091 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-96bbb" event={"ID":"9bf30547-62ef-4eea-b806-6a6eabe22ef3","Type":"ContainerDied","Data":"6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49"} Apr 16 20:00:35.230146 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:35.230116 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d67a01adaeed2ab5031182d8e6b8e91df43ebf2d4a645f49f6ed2e965981d49" Apr 16 20:00:44.812114 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.812081 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"] Apr 16 20:00:44.812715 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.812568 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bf30547-62ef-4eea-b806-6a6eabe22ef3" containerName="s3-init" Apr 16 20:00:44.812715 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.812589 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf30547-62ef-4eea-b806-6a6eabe22ef3" containerName="s3-init" Apr 16 20:00:44.812715 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.812697 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bf30547-62ef-4eea-b806-6a6eabe22ef3" containerName="s3-init" Apr 16 20:00:44.864659 
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.864624 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"] Apr 16 20:00:44.864785 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.864683 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:00:44.867531 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.867512 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\"" Apr 16 20:00:44.955734 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.955708 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"] Apr 16 20:00:44.958035 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.958015 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" Apr 16 20:00:44.970357 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.970337 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"] Apr 16 20:00:44.975571 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:44.975548 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-x9n2v\" (UID: \"3264742a-0448-4b74-89ac-1eaae6425d2e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:00:45.077060 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.076983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-v922p\" (UID: \"6fde0b41-7529-4d1e-8fab-c2bbff747b35\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" Apr 16 20:00:45.077215 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.077080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-x9n2v\" (UID: \"3264742a-0448-4b74-89ac-1eaae6425d2e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:00:45.077538 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.077518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-x9n2v\" (UID: \"3264742a-0448-4b74-89ac-1eaae6425d2e\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:00:45.174136 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.174112 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:00:45.178283 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.178263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-v922p\" (UID: \"6fde0b41-7529-4d1e-8fab-c2bbff747b35\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" Apr 16 20:00:45.178679 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.178656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-v922p\" (UID: \"6fde0b41-7529-4d1e-8fab-c2bbff747b35\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" Apr 16 20:00:45.267548 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.267522 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" Apr 16 20:00:45.297434 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.297397 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"] Apr 16 20:00:45.298852 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:00:45.298814 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3264742a_0448_4b74_89ac_1eaae6425d2e.slice/crio-961c4b1d0a5a55315323ada72ebc962c7d57d7ea741ff6730734926ab5bdadce WatchSource:0}: Error finding container 961c4b1d0a5a55315323ada72ebc962c7d57d7ea741ff6730734926ab5bdadce: Status 404 returned error can't find the container with id 961c4b1d0a5a55315323ada72ebc962c7d57d7ea741ff6730734926ab5bdadce Apr 16 20:00:45.390987 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:45.390954 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"] Apr 16 20:00:45.393818 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:00:45.393785 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fde0b41_7529_4d1e_8fab_c2bbff747b35.slice/crio-2e112797a0838b28d9a84c085156c74021fee38c6701970779270c833a264b98 WatchSource:0}: Error finding container 2e112797a0838b28d9a84c085156c74021fee38c6701970779270c833a264b98: Status 404 returned error can't find the container with id 2e112797a0838b28d9a84c085156c74021fee38c6701970779270c833a264b98 Apr 16 20:00:46.264033 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:46.263972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerStarted","Data":"2e112797a0838b28d9a84c085156c74021fee38c6701970779270c833a264b98"} Apr 16 
20:00:46.265946 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:46.265916 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerStarted","Data":"961c4b1d0a5a55315323ada72ebc962c7d57d7ea741ff6730734926ab5bdadce"} Apr 16 20:00:49.278578 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:49.278537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerStarted","Data":"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a"} Apr 16 20:00:49.280056 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:49.280032 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerStarted","Data":"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb"} Apr 16 20:00:53.294999 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:53.294961 2568 generic.go:358] "Generic (PLEG): container finished" podID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerID="ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a" exitCode=0 Apr 16 20:00:53.295455 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:53.295044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerDied","Data":"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a"} Apr 16 20:00:53.296588 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:53.296484 2568 generic.go:358] "Generic (PLEG): container finished" podID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerID="065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb" exitCode=0 Apr 16 
20:00:53.296588 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:53.296518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerDied","Data":"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb"} Apr 16 20:00:58.334750 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.334697 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c68c599bb-zmf8r" podUID="7dc19d41-37dc-4185-a862-b53f4d19cd10" containerName="console" containerID="cri-o://1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db" gracePeriod=15 Apr 16 20:00:58.620979 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.620948 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c68c599bb-zmf8r_7dc19d41-37dc-4185-a862-b53f4d19cd10/console/0.log" Apr 16 20:00:58.621103 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.621026 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 20:00:58.799599 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799558 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.799780 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799628 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.799780 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799670 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.799780 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799720 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.799780 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799756 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98mv\" (UniqueName: \"kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 
20:00:58.799996 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799802 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.799996 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.799827 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle\") pod \"7dc19d41-37dc-4185-a862-b53f4d19cd10\" (UID: \"7dc19d41-37dc-4185-a862-b53f4d19cd10\") " Apr 16 20:00:58.800099 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.800038 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:00:58.800391 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.800369 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:00:58.800438 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.800404 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca" (OuterVolumeSpecName: "service-ca") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:00:58.800699 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.800641 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config" (OuterVolumeSpecName: "console-config") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:00:58.803317 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.803287 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv" (OuterVolumeSpecName: "kube-api-access-s98mv") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "kube-api-access-s98mv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:00:58.804530 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.804497 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:00:58.805556 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.805523 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7dc19d41-37dc-4185-a862-b53f4d19cd10" (UID: "7dc19d41-37dc-4185-a862-b53f4d19cd10"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:00:58.900575 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900495 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s98mv\" (UniqueName: \"kubernetes.io/projected/7dc19d41-37dc-4185-a862-b53f4d19cd10-kube-api-access-s98mv\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900575 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900532 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-service-ca\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900575 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900549 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-trusted-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900575 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900565 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900575 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900579 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-oauth-config\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900948 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900593 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7dc19d41-37dc-4185-a862-b53f4d19cd10-oauth-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:58.900948 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:58.900609 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc19d41-37dc-4185-a862-b53f4d19cd10-console-serving-cert\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:00:59.332728 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332697 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c68c599bb-zmf8r_7dc19d41-37dc-4185-a862-b53f4d19cd10/console/0.log" Apr 16 20:00:59.332925 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332744 2568 generic.go:358] "Generic (PLEG): container finished" podID="7dc19d41-37dc-4185-a862-b53f4d19cd10" containerID="1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db" exitCode=2 Apr 16 20:00:59.332925 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332874 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c68c599bb-zmf8r" event={"ID":"7dc19d41-37dc-4185-a862-b53f4d19cd10","Type":"ContainerDied","Data":"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db"} Apr 16 20:00:59.332925 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332902 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c68c599bb-zmf8r" event={"ID":"7dc19d41-37dc-4185-a862-b53f4d19cd10","Type":"ContainerDied","Data":"dac30c4610a75e86478cbae560b8c9cd92e194b2d9f864b83272be6c9b63fa71"} Apr 16 20:00:59.332925 
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332921 2568 scope.go:117] "RemoveContainer" containerID="1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db" Apr 16 20:00:59.333153 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.332919 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c68c599bb-zmf8r" Apr 16 20:00:59.346757 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.346527 2568 scope.go:117] "RemoveContainer" containerID="1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db" Apr 16 20:00:59.347062 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:00:59.346792 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db\": container with ID starting with 1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db not found: ID does not exist" containerID="1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db" Apr 16 20:00:59.347062 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.346827 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db"} err="failed to get container status \"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db\": rpc error: code = NotFound desc = could not find container \"1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db\": container with ID starting with 1c1b438d514db0bfad03248bb9c599e1535dd5f7095fff76c5874731efd778db not found: ID does not exist" Apr 16 20:00:59.364240 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.364196 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"] Apr 16 20:00:59.380525 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:00:59.380483 2568 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-5c68c599bb-zmf8r"]
Apr 16 20:01:00.899642 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:00.899475 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc19d41-37dc-4185-a862-b53f4d19cd10" path="/var/lib/kubelet/pods/7dc19d41-37dc-4185-a862-b53f4d19cd10/volumes"
Apr 16 20:01:08.368666 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:08.368584 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerStarted","Data":"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6"}
Apr 16 20:01:08.369059 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:08.368965 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"
Apr 16 20:01:08.370356 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:08.370312 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:08.388352 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:08.388309 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podStartSLOduration=1.6943835360000001 podStartE2EDuration="24.388298307s" podCreationTimestamp="2026-04-16 20:00:44 +0000 UTC" firstStartedPulling="2026-04-16 20:00:45.395733777 +0000 UTC m=+405.071735834" lastFinishedPulling="2026-04-16 20:01:08.089648544 +0000 UTC m=+427.765650605" observedRunningTime="2026-04-16 20:01:08.38673898 +0000 UTC m=+428.062741067" watchObservedRunningTime="2026-04-16 20:01:08.388298307 +0000 UTC m=+428.064300383"
Apr 16 20:01:09.372597 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:09.372559 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:19.373021 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:19.372926 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:27.434568 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:27.434474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerStarted","Data":"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd"}
Apr 16 20:01:27.434940 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:27.434761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"
Apr 16 20:01:27.435958 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:27.435932 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:01:27.453666 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:27.453621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podStartSLOduration=1.657676907 podStartE2EDuration="43.453607811s" podCreationTimestamp="2026-04-16 20:00:44 +0000 UTC" firstStartedPulling="2026-04-16 20:00:45.300553164 +0000 UTC m=+404.976555218" lastFinishedPulling="2026-04-16 20:01:27.096484064 +0000 UTC m=+446.772486122" observedRunningTime="2026-04-16 20:01:27.45227347 +0000 UTC m=+447.128275584" watchObservedRunningTime="2026-04-16 20:01:27.453607811 +0000 UTC m=+447.129609866"
Apr 16 20:01:28.438034 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:28.437996 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:01:29.372903 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:29.372860 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:38.438342 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:38.438300 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:01:39.373028 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:39.372989 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:48.438054 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:48.438005 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:01:49.373475 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:49.373434 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:01:58.438234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:58.438187 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:01:59.373289 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:01:59.373249 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:02:04.366749 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.366715 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:04.367104 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.367030 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dc19d41-37dc-4185-a862-b53f4d19cd10" containerName="console"
Apr 16 20:02:04.367104 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.367040 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc19d41-37dc-4185-a862-b53f4d19cd10" containerName="console"
Apr 16 20:02:04.367216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.367105 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dc19d41-37dc-4185-a862-b53f4d19cd10" containerName="console"
Apr 16 20:02:04.395261 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.395232 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:04.395408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.395343 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:04.398040 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.398016 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1bf21-kube-rbac-proxy-sar-config\""
Apr 16 20:02:04.398284 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.398268 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-1bf21-serving-cert\""
Apr 16 20:02:04.398382 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.398281 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:02:04.437859 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.437830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:04.437972 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.437888 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:04.538428 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.538401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:04.538530 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.538450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:04.538575 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:04.538534 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-1bf21-serving-cert: secret "switch-graph-1bf21-serving-cert" not found
Apr 16 20:02:04.538608 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:04.538594 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls podName:c738934a-b6dc-4ce0-9172-802752c19936 nodeName:}" failed. No retries permitted until 2026-04-16 20:02:05.03857756 +0000 UTC m=+484.714579618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls") pod "switch-graph-1bf21-b5cd598bc-pm5nx" (UID: "c738934a-b6dc-4ce0-9172-802752c19936") : secret "switch-graph-1bf21-serving-cert" not found
Apr 16 20:02:04.538975 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:04.538956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:05.042402 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:05.042373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:05.044684 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:05.044657 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") pod \"switch-graph-1bf21-b5cd598bc-pm5nx\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") " pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:05.305224 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:05.305124 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:05.425847 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:05.425824 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:05.427992 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:02:05.427956 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc738934a_b6dc_4ce0_9172_802752c19936.slice/crio-1716f2ea2a4ec96f4668cdf037cc364244761239f79665cd8f1794f3e13ab280 WatchSource:0}: Error finding container 1716f2ea2a4ec96f4668cdf037cc364244761239f79665cd8f1794f3e13ab280: Status 404 returned error can't find the container with id 1716f2ea2a4ec96f4668cdf037cc364244761239f79665cd8f1794f3e13ab280
Apr 16 20:02:05.558104 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:05.558024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" event={"ID":"c738934a-b6dc-4ce0-9172-802752c19936","Type":"ContainerStarted","Data":"1716f2ea2a4ec96f4668cdf037cc364244761239f79665cd8f1794f3e13ab280"}
Apr 16 20:02:08.438713 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:08.438677 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:02:08.569004 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:08.568966 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" event={"ID":"c738934a-b6dc-4ce0-9172-802752c19936","Type":"ContainerStarted","Data":"f7eb588e9a446b0b95231e947ae743160112d724787bb219cc5c4b028eb9dc69"}
Apr 16 20:02:08.569188 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:08.569101 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:08.586500 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:08.586454 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podStartSLOduration=2.06062531 podStartE2EDuration="4.586437316s" podCreationTimestamp="2026-04-16 20:02:04 +0000 UTC" firstStartedPulling="2026-04-16 20:02:05.430000815 +0000 UTC m=+485.106002869" lastFinishedPulling="2026-04-16 20:02:07.955812811 +0000 UTC m=+487.631814875" observedRunningTime="2026-04-16 20:02:08.585205495 +0000 UTC m=+488.261207572" watchObservedRunningTime="2026-04-16 20:02:08.586437316 +0000 UTC m=+488.262439393"
Apr 16 20:02:09.373132 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:09.373092 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:02:10.896945 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:10.896907 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 16 20:02:14.577495 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:14.577464 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:18.438152 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:18.438111 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 20:02:18.672229 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:18.672192 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:18.672488 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:18.672450 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" containerID="cri-o://f7eb588e9a446b0b95231e947ae743160112d724787bb219cc5c4b028eb9dc69" gracePeriod=30
Apr 16 20:02:19.576063 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:19.576027 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:20.898067 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:20.898038 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"
Apr 16 20:02:24.575687 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:24.575650 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:28.439367 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:28.439338 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"
Apr 16 20:02:29.575681 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:29.575640 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:29.576157 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:29.575754 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:34.575906 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:34.575864 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:39.575722 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:39.575685 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:44.403052 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.402972 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"]
Apr 16 20:02:44.405917 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.405891 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:44.408766 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.408735 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 16 20:02:44.408766 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.408735 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 16 20:02:44.414468 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.414447 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"]
Apr 16 20:02:44.552858 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.552825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:44.553014 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.552872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:44.576296 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.576254 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:44.653753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.653680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:44.653753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.653728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:44.653938 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:44.653833 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found
Apr 16 20:02:44.653938 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:44.653911 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls podName:bdf86144-544d-4592-9cf9-e35b28dbaf48 nodeName:}" failed. No retries permitted until 2026-04-16 20:02:45.153890533 +0000 UTC m=+524.829892586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls") pod "model-chainer-77968cf4b8-6522x" (UID: "bdf86144-544d-4592-9cf9-e35b28dbaf48") : secret "model-chainer-serving-cert" not found
Apr 16 20:02:44.654439 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:44.654418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:45.157899 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.157866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:45.160140 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.160119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") pod \"model-chainer-77968cf4b8-6522x\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:45.316748 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.316721 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:45.437484 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.437457 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"]
Apr 16 20:02:45.439322 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:02:45.439284 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf86144_544d_4592_9cf9_e35b28dbaf48.slice/crio-acc500f269b64ff99d517f3f5c6751b7edfd05d8700d4dee67c43a2881e767b8 WatchSource:0}: Error finding container acc500f269b64ff99d517f3f5c6751b7edfd05d8700d4dee67c43a2881e767b8: Status 404 returned error can't find the container with id acc500f269b64ff99d517f3f5c6751b7edfd05d8700d4dee67c43a2881e767b8
Apr 16 20:02:45.691053 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.690966 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" event={"ID":"bdf86144-544d-4592-9cf9-e35b28dbaf48","Type":"ContainerStarted","Data":"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d"}
Apr 16 20:02:45.691053 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.691010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" event={"ID":"bdf86144-544d-4592-9cf9-e35b28dbaf48","Type":"ContainerStarted","Data":"acc500f269b64ff99d517f3f5c6751b7edfd05d8700d4dee67c43a2881e767b8"}
Apr 16 20:02:45.691264 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.691103 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:45.709377 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:45.709336 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podStartSLOduration=1.709320489 podStartE2EDuration="1.709320489s" podCreationTimestamp="2026-04-16 20:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:02:45.707497991 +0000 UTC m=+525.383500068" watchObservedRunningTime="2026-04-16 20:02:45.709320489 +0000 UTC m=+525.385322565"
Apr 16 20:02:48.701825 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.701792 2568 generic.go:358] "Generic (PLEG): container finished" podID="c738934a-b6dc-4ce0-9172-802752c19936" containerID="f7eb588e9a446b0b95231e947ae743160112d724787bb219cc5c4b028eb9dc69" exitCode=137
Apr 16 20:02:48.702153 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.701866 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" event={"ID":"c738934a-b6dc-4ce0-9172-802752c19936","Type":"ContainerDied","Data":"f7eb588e9a446b0b95231e947ae743160112d724787bb219cc5c4b028eb9dc69"}
Apr 16 20:02:48.812143 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.812119 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:48.884192 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.884142 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") pod \"c738934a-b6dc-4ce0-9172-802752c19936\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") "
Apr 16 20:02:48.884368 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.884249 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle\") pod \"c738934a-b6dc-4ce0-9172-802752c19936\" (UID: \"c738934a-b6dc-4ce0-9172-802752c19936\") "
Apr 16 20:02:48.884581 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.884558 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c738934a-b6dc-4ce0-9172-802752c19936" (UID: "c738934a-b6dc-4ce0-9172-802752c19936"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:02:48.886334 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.886308 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c738934a-b6dc-4ce0-9172-802752c19936" (UID: "c738934a-b6dc-4ce0-9172-802752c19936"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:02:48.985121 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.985049 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c738934a-b6dc-4ce0-9172-802752c19936-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:02:48.985121 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:48.985077 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c738934a-b6dc-4ce0-9172-802752c19936-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:02:49.706377 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:49.706350 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"
Apr 16 20:02:49.706831 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:49.706350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx" event={"ID":"c738934a-b6dc-4ce0-9172-802752c19936","Type":"ContainerDied","Data":"1716f2ea2a4ec96f4668cdf037cc364244761239f79665cd8f1794f3e13ab280"}
Apr 16 20:02:49.706831 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:49.706466 2568 scope.go:117] "RemoveContainer" containerID="f7eb588e9a446b0b95231e947ae743160112d724787bb219cc5c4b028eb9dc69"
Apr 16 20:02:49.723249 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:49.723226 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:49.726871 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:49.726851 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-1bf21-b5cd598bc-pm5nx"]
Apr 16 20:02:50.897423 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:50.897387 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c738934a-b6dc-4ce0-9172-802752c19936" path="/var/lib/kubelet/pods/c738934a-b6dc-4ce0-9172-802752c19936/volumes"
Apr 16 20:02:51.700384 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:51.700358 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"
Apr 16 20:02:54.497519 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.497483 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"]
Apr 16 20:02:54.497893 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.497681 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" containerID="cri-o://4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d" gracePeriod=30
Apr 16 20:02:54.665353 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.665322 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"]
Apr 16 20:02:54.665642 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.665617 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" containerID="cri-o://3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd" gracePeriod=30
Apr 16 20:02:54.754236 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.754209 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"]
Apr 16 20:02:54.754469 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:54.754450 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" containerID="cri-o://c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6" gracePeriod=30
Apr 16 20:02:56.698807 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:56.698765 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:02:57.999521 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:57.999500 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"
Apr 16 20:02:58.160304 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.160212 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location\") pod \"3264742a-0448-4b74-89ac-1eaae6425d2e\" (UID: \"3264742a-0448-4b74-89ac-1eaae6425d2e\") "
Apr 16 20:02:58.160540 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.160514 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3264742a-0448-4b74-89ac-1eaae6425d2e" (UID: "3264742a-0448-4b74-89ac-1eaae6425d2e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:02:58.261674 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.261640 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3264742a-0448-4b74-89ac-1eaae6425d2e-kserve-provision-location\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:02:58.696883 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.696861 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"
Apr 16 20:02:58.737681 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.737597 2568 generic.go:358] "Generic (PLEG): container finished" podID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerID="c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6" exitCode=0
Apr 16 20:02:58.737832 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.737680 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"
Apr 16 20:02:58.737832 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.737679 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerDied","Data":"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6"}
Apr 16 20:02:58.737832 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.737810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p" event={"ID":"6fde0b41-7529-4d1e-8fab-c2bbff747b35","Type":"ContainerDied","Data":"2e112797a0838b28d9a84c085156c74021fee38c6701970779270c833a264b98"}
Apr 16 20:02:58.738020 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.737835 2568 scope.go:117] "RemoveContainer" containerID="c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6"
Apr 16 20:02:58.739065 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.739043 2568 generic.go:358] "Generic (PLEG): container finished" podID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerID="3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd" exitCode=0
Apr 16 20:02:58.739183 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.739118 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" Apr 16 20:02:58.739183 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.739121 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerDied","Data":"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd"} Apr 16 20:02:58.739183 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.739158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v" event={"ID":"3264742a-0448-4b74-89ac-1eaae6425d2e","Type":"ContainerDied","Data":"961c4b1d0a5a55315323ada72ebc962c7d57d7ea741ff6730734926ab5bdadce"} Apr 16 20:02:58.745937 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.745919 2568 scope.go:117] "RemoveContainer" containerID="ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a" Apr 16 20:02:58.752962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.752945 2568 scope.go:117] "RemoveContainer" containerID="c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6" Apr 16 20:02:58.753267 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:58.753192 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6\": container with ID starting with c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6 not found: ID does not exist" containerID="c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6" Apr 16 20:02:58.753267 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.753224 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6"} err="failed to get container status 
\"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6\": rpc error: code = NotFound desc = could not find container \"c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6\": container with ID starting with c8251fe1f4f45958630cfe807c299d9a64f9dabc17c4472689c6ffaa80d197d6 not found: ID does not exist" Apr 16 20:02:58.753267 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.753247 2568 scope.go:117] "RemoveContainer" containerID="ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a" Apr 16 20:02:58.753482 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:58.753463 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a\": container with ID starting with ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a not found: ID does not exist" containerID="ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a" Apr 16 20:02:58.753556 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.753489 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a"} err="failed to get container status \"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a\": rpc error: code = NotFound desc = could not find container \"ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a\": container with ID starting with ea0a9d61dde539030a99b7fc43bc37c49df97e12e7e03b632aa0fd760cff0d9a not found: ID does not exist" Apr 16 20:02:58.753556 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.753505 2568 scope.go:117] "RemoveContainer" containerID="3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd" Apr 16 20:02:58.760645 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.760620 2568 scope.go:117] "RemoveContainer" 
containerID="065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb" Apr 16 20:02:58.762129 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.762109 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"] Apr 16 20:02:58.765441 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.765420 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-x9n2v"] Apr 16 20:02:58.769700 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.769681 2568 scope.go:117] "RemoveContainer" containerID="3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd" Apr 16 20:02:58.769965 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:58.769948 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd\": container with ID starting with 3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd not found: ID does not exist" containerID="3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd" Apr 16 20:02:58.770024 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.769970 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd"} err="failed to get container status \"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd\": rpc error: code = NotFound desc = could not find container \"3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd\": container with ID starting with 3b087c726cdbab5a58fea9a864fac9a3d139a596f30561e6ddb7ba42479c2edd not found: ID does not exist" Apr 16 20:02:58.770024 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.769994 2568 scope.go:117] "RemoveContainer" containerID="065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb" Apr 16 
20:02:58.770256 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:02:58.770240 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb\": container with ID starting with 065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb not found: ID does not exist" containerID="065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb" Apr 16 20:02:58.770322 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.770274 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb"} err="failed to get container status \"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb\": rpc error: code = NotFound desc = could not find container \"065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb\": container with ID starting with 065aed748b91bf06c3153c8fe43af58ad55b124d8ab5b80e73a0c4e9863bb2bb not found: ID does not exist" Apr 16 20:02:58.866082 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.866054 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location\") pod \"6fde0b41-7529-4d1e-8fab-c2bbff747b35\" (UID: \"6fde0b41-7529-4d1e-8fab-c2bbff747b35\") " Apr 16 20:02:58.866396 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.866375 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6fde0b41-7529-4d1e-8fab-c2bbff747b35" (UID: "6fde0b41-7529-4d1e-8fab-c2bbff747b35"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:02:58.897132 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.897103 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" path="/var/lib/kubelet/pods/3264742a-0448-4b74-89ac-1eaae6425d2e/volumes" Apr 16 20:02:58.967467 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:58.967439 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fde0b41-7529-4d1e-8fab-c2bbff747b35-kserve-provision-location\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:02:59.053546 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:59.053521 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"] Apr 16 20:02:59.057316 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:02:59.057291 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-v922p"] Apr 16 20:03:00.897512 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:00.897476 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" path="/var/lib/kubelet/pods/6fde0b41-7529-4d1e-8fab-c2bbff747b35/volumes" Apr 16 20:03:01.698628 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:01.698589 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:06.698694 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:06.698659 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 16 20:03:06.699075 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:06.698757 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" Apr 16 20:03:11.698816 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:11.698780 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:16.698796 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:16.698761 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:21.698983 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:21.698946 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:03:24.690064 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.690040 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" Apr 16 20:03:24.758506 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.758441 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle\") pod \"bdf86144-544d-4592-9cf9-e35b28dbaf48\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " Apr 16 20:03:24.758627 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.758531 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") pod \"bdf86144-544d-4592-9cf9-e35b28dbaf48\" (UID: \"bdf86144-544d-4592-9cf9-e35b28dbaf48\") " Apr 16 20:03:24.758688 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.758668 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bdf86144-544d-4592-9cf9-e35b28dbaf48" (UID: "bdf86144-544d-4592-9cf9-e35b28dbaf48"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:03:24.760510 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.760489 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bdf86144-544d-4592-9cf9-e35b28dbaf48" (UID: "bdf86144-544d-4592-9cf9-e35b28dbaf48"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:03:24.818956 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.818926 2568 generic.go:358] "Generic (PLEG): container finished" podID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerID="4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d" exitCode=0 Apr 16 20:03:24.819055 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.818987 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" Apr 16 20:03:24.819055 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.819009 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" event={"ID":"bdf86144-544d-4592-9cf9-e35b28dbaf48","Type":"ContainerDied","Data":"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d"} Apr 16 20:03:24.819055 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.819046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x" event={"ID":"bdf86144-544d-4592-9cf9-e35b28dbaf48","Type":"ContainerDied","Data":"acc500f269b64ff99d517f3f5c6751b7edfd05d8700d4dee67c43a2881e767b8"} Apr 16 20:03:24.819151 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.819068 2568 scope.go:117] "RemoveContainer" containerID="4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d" Apr 16 20:03:24.830936 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.830917 2568 scope.go:117] "RemoveContainer" containerID="4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d" Apr 16 20:03:24.831240 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:03:24.831218 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d\": container with ID starting with 
4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d not found: ID does not exist" containerID="4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d" Apr 16 20:03:24.831296 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.831254 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d"} err="failed to get container status \"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d\": rpc error: code = NotFound desc = could not find container \"4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d\": container with ID starting with 4a86a58fb259c9f6bc428b98820c05375b756d5caa700cfda837727e59319d2d not found: ID does not exist" Apr 16 20:03:24.842840 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.842818 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"] Apr 16 20:03:24.847917 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.847898 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-77968cf4b8-6522x"] Apr 16 20:03:24.859316 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.859297 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdf86144-544d-4592-9cf9-e35b28dbaf48-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:03:24.859391 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.859320 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bdf86144-544d-4592-9cf9-e35b28dbaf48-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:03:24.897201 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:24.897161 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" path="/var/lib/kubelet/pods/bdf86144-544d-4592-9cf9-e35b28dbaf48/volumes" Apr 16 20:03:28.916265 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916234 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916521 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916533 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916543 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916549 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916559 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916564 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916571 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="storage-initializer" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916576 2568 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="storage-initializer" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916585 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916590 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916598 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="storage-initializer" Apr 16 20:03:28.916621 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916603 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="storage-initializer" Apr 16 20:03:28.916962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916665 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fde0b41-7529-4d1e-8fab-c2bbff747b35" containerName="kserve-container" Apr 16 20:03:28.916962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916672 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3264742a-0448-4b74-89ac-1eaae6425d2e" containerName="kserve-container" Apr 16 20:03:28.916962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916679 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdf86144-544d-4592-9cf9-e35b28dbaf48" containerName="model-chainer" Apr 16 20:03:28.916962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.916686 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c738934a-b6dc-4ce0-9172-802752c19936" containerName="switch-graph-1bf21" Apr 16 20:03:28.921130 ip-10-0-140-191 kubenswrapper[2568]: I0416 
20:03:28.921113 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:28.924393 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.924103 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\"" Apr 16 20:03:28.924393 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.924100 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:03:28.924393 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.924200 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3d0d2-serving-cert\"" Apr 16 20:03:28.925632 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.925610 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-3d0d2-kube-rbac-proxy-sar-config\"" Apr 16 20:03:28.927469 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.927429 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:03:28.988577 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.988545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:28.988699 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:28.988682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") pod 
\"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.089371 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.089347 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.089485 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.089408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.089527 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:03:29.089480 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-3d0d2-serving-cert: secret "switch-graph-3d0d2-serving-cert" not found Apr 16 20:03:29.089565 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:03:29.089546 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls podName:d15cb7d3-244f-4259-9938-976efcabc968 nodeName:}" failed. No retries permitted until 2026-04-16 20:03:29.589530033 +0000 UTC m=+569.265532086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls") pod "switch-graph-3d0d2-5f445cd894-whsfp" (UID: "d15cb7d3-244f-4259-9938-976efcabc968") : secret "switch-graph-3d0d2-serving-cert" not found Apr 16 20:03:29.089946 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.089928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.593375 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.593345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.595672 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.595651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") pod \"switch-graph-3d0d2-5f445cd894-whsfp\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.832247 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.832208 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:29.948486 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:29.948463 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:03:29.950689 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:03:29.950663 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15cb7d3_244f_4259_9938_976efcabc968.slice/crio-a70a80a8b238fd07f5e22dc5e654f630f179959285ec0d58bf2dd8fbf310c8df WatchSource:0}: Error finding container a70a80a8b238fd07f5e22dc5e654f630f179959285ec0d58bf2dd8fbf310c8df: Status 404 returned error can't find the container with id a70a80a8b238fd07f5e22dc5e654f630f179959285ec0d58bf2dd8fbf310c8df Apr 16 20:03:30.838890 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:30.838854 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" event={"ID":"d15cb7d3-244f-4259-9938-976efcabc968","Type":"ContainerStarted","Data":"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732"} Apr 16 20:03:30.838890 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:30.838889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" event={"ID":"d15cb7d3-244f-4259-9938-976efcabc968","Type":"ContainerStarted","Data":"a70a80a8b238fd07f5e22dc5e654f630f179959285ec0d58bf2dd8fbf310c8df"} Apr 16 20:03:30.839086 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:30.839003 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:03:30.856291 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:30.856248 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" 
podStartSLOduration=2.856234943 podStartE2EDuration="2.856234943s" podCreationTimestamp="2026-04-16 20:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:03:30.855522188 +0000 UTC m=+570.531524264" watchObservedRunningTime="2026-04-16 20:03:30.856234943 +0000 UTC m=+570.532237018" Apr 16 20:03:36.848811 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:03:36.848783 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:04:04.692832 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.692794 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:04:04.696298 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.696282 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:04.698987 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.698964 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-99283-serving-cert\"" Apr 16 20:04:04.699116 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.698964 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-99283-kube-rbac-proxy-sar-config\"" Apr 16 20:04:04.702060 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.702034 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:04:04.869006 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.868973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:04.869195 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.869015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:04.970353 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.970271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:04.970507 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.970376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:04.970507 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:04:04.970431 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-99283-serving-cert: secret "sequence-graph-99283-serving-cert" not found Apr 16 20:04:04.970507 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:04:04.970499 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls podName:b94a4630-744f-491c-8350-af9ca0eb5a26 nodeName:}" failed. No retries permitted until 2026-04-16 20:04:05.470476611 +0000 UTC m=+605.146478671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls") pod "sequence-graph-99283-794d7b9b66-gv4p9" (UID: "b94a4630-744f-491c-8350-af9ca0eb5a26") : secret "sequence-graph-99283-serving-cert" not found Apr 16 20:04:04.971005 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:04.970987 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:05.475833 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.475797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:05.478081 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.478063 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") pod \"sequence-graph-99283-794d7b9b66-gv4p9\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:05.608044 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.608012 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:05.724297 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.724276 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:04:05.726231 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:04:05.726158 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94a4630_744f_491c_8350_af9ca0eb5a26.slice/crio-fe9440ea82ea52e8dcc8614ef2fc2f95395d8f52e9af16400b3a6d6ca64af644 WatchSource:0}: Error finding container fe9440ea82ea52e8dcc8614ef2fc2f95395d8f52e9af16400b3a6d6ca64af644: Status 404 returned error can't find the container with id fe9440ea82ea52e8dcc8614ef2fc2f95395d8f52e9af16400b3a6d6ca64af644 Apr 16 20:04:05.943394 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.943356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" event={"ID":"b94a4630-744f-491c-8350-af9ca0eb5a26","Type":"ContainerStarted","Data":"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0"} Apr 16 20:04:05.943565 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.943399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" event={"ID":"b94a4630-744f-491c-8350-af9ca0eb5a26","Type":"ContainerStarted","Data":"fe9440ea82ea52e8dcc8614ef2fc2f95395d8f52e9af16400b3a6d6ca64af644"} Apr 16 20:04:05.943565 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.943439 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:04:05.960260 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:05.960217 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" 
podStartSLOduration=1.960203546 podStartE2EDuration="1.960203546s" podCreationTimestamp="2026-04-16 20:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:04:05.959488569 +0000 UTC m=+605.635490643" watchObservedRunningTime="2026-04-16 20:04:05.960203546 +0000 UTC m=+605.636205622" Apr 16 20:04:11.952333 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:04:11.952300 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:11:43.742667 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:43.742593 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:11:43.743156 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:43.742841 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" containerID="cri-o://412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732" gracePeriod=30 Apr 16 20:11:46.846165 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:46.846125 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:11:51.846789 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:51.846748 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:11:56.845935 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:56.845890 2568 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:11:56.846422 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:11:56.846042 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:12:01.847080 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:01.847043 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:06.846372 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:06.846335 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:11.846337 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:11.846300 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:13.888568 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:13.888545 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:12:13.941495 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:13.941462 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle\") pod \"d15cb7d3-244f-4259-9938-976efcabc968\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " Apr 16 20:12:13.941653 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:13.941583 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") pod \"d15cb7d3-244f-4259-9938-976efcabc968\" (UID: \"d15cb7d3-244f-4259-9938-976efcabc968\") " Apr 16 20:12:13.941801 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:13.941779 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d15cb7d3-244f-4259-9938-976efcabc968" (UID: "d15cb7d3-244f-4259-9938-976efcabc968"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:12:13.943523 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:13.943503 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d15cb7d3-244f-4259-9938-976efcabc968" (UID: "d15cb7d3-244f-4259-9938-976efcabc968"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:12:14.042593 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.042560 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d15cb7d3-244f-4259-9938-976efcabc968-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:12:14.042593 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.042585 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d15cb7d3-244f-4259-9938-976efcabc968-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:12:14.339717 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.339624 2568 generic.go:358] "Generic (PLEG): container finished" podID="d15cb7d3-244f-4259-9938-976efcabc968" containerID="412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732" exitCode=0 Apr 16 20:12:14.339717 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.339682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" event={"ID":"d15cb7d3-244f-4259-9938-976efcabc968","Type":"ContainerDied","Data":"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732"} Apr 16 20:12:14.339717 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.339714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" event={"ID":"d15cb7d3-244f-4259-9938-976efcabc968","Type":"ContainerDied","Data":"a70a80a8b238fd07f5e22dc5e654f630f179959285ec0d58bf2dd8fbf310c8df"} Apr 16 20:12:14.339990 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.339716 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp" Apr 16 20:12:14.339990 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.339733 2568 scope.go:117] "RemoveContainer" containerID="412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732" Apr 16 20:12:14.350028 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.350008 2568 scope.go:117] "RemoveContainer" containerID="412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732" Apr 16 20:12:14.350318 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:12:14.350295 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732\": container with ID starting with 412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732 not found: ID does not exist" containerID="412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732" Apr 16 20:12:14.350426 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.350323 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732"} err="failed to get container status \"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732\": rpc error: code = NotFound desc = could not find container \"412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732\": container with ID starting with 412f0fda88e476ed286efc981169f072047d5e0f9ac24b830c9adf90e98da732 not found: ID does not exist" Apr 16 20:12:14.364730 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.364707 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:12:14.367930 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.367911 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-3d0d2-5f445cd894-whsfp"] Apr 16 20:12:14.897732 
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:14.897687 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15cb7d3-244f-4259-9938-976efcabc968" path="/var/lib/kubelet/pods/d15cb7d3-244f-4259-9938-976efcabc968/volumes" Apr 16 20:12:19.437619 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:19.437587 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:12:19.438097 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:19.437891 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" containerID="cri-o://132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0" gracePeriod=30 Apr 16 20:12:21.951000 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:21.950957 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:26.950383 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:26.950343 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:31.950818 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:31.950780 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:31.951311 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:31.950891 
2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:12:36.950451 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:36.950413 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:41.950490 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:41.950448 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:46.950755 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:46.950714 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:49.586022 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.585998 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:12:49.703997 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.703925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") pod \"b94a4630-744f-491c-8350-af9ca0eb5a26\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " Apr 16 20:12:49.704133 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.704014 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle\") pod \"b94a4630-744f-491c-8350-af9ca0eb5a26\" (UID: \"b94a4630-744f-491c-8350-af9ca0eb5a26\") " Apr 16 20:12:49.704371 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.704349 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b94a4630-744f-491c-8350-af9ca0eb5a26" (UID: "b94a4630-744f-491c-8350-af9ca0eb5a26"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:12:49.706008 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.705988 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b94a4630-744f-491c-8350-af9ca0eb5a26" (UID: "b94a4630-744f-491c-8350-af9ca0eb5a26"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:12:49.804658 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.804623 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b94a4630-744f-491c-8350-af9ca0eb5a26-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:12:49.804658 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:49.804655 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b94a4630-744f-491c-8350-af9ca0eb5a26-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:12:50.447007 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.446971 2568 generic.go:358] "Generic (PLEG): container finished" podID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerID="132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0" exitCode=0 Apr 16 20:12:50.447200 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.447032 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" Apr 16 20:12:50.447200 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.447031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" event={"ID":"b94a4630-744f-491c-8350-af9ca0eb5a26","Type":"ContainerDied","Data":"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0"} Apr 16 20:12:50.447200 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.447132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9" event={"ID":"b94a4630-744f-491c-8350-af9ca0eb5a26","Type":"ContainerDied","Data":"fe9440ea82ea52e8dcc8614ef2fc2f95395d8f52e9af16400b3a6d6ca64af644"} Apr 16 20:12:50.447200 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.447147 2568 scope.go:117] "RemoveContainer" containerID="132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0" Apr 16 20:12:50.454995 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.454980 2568 scope.go:117] "RemoveContainer" containerID="132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0" Apr 16 20:12:50.455236 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:12:50.455218 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0\": container with ID starting with 132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0 not found: ID does not exist" containerID="132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0" Apr 16 20:12:50.455295 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.455242 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0"} err="failed to get container status 
\"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0\": rpc error: code = NotFound desc = could not find container \"132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0\": container with ID starting with 132a8d100ba0a6fd48c13d952a0eccb9eecce1d52364309bd8e96ffc607d42a0 not found: ID does not exist" Apr 16 20:12:50.467312 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.467287 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:12:50.470617 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.470598 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-99283-794d7b9b66-gv4p9"] Apr 16 20:12:50.897286 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:50.897260 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" path="/var/lib/kubelet/pods/b94a4630-744f-491c-8350-af9ca0eb5a26/volumes" Apr 16 20:12:54.003162 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003131 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"] Apr 16 20:12:54.003640 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003596 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" Apr 16 20:12:54.003640 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003615 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" Apr 16 20:12:54.003640 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003637 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" Apr 16 20:12:54.003798 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003646 2568 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" Apr 16 20:12:54.003798 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003717 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d15cb7d3-244f-4259-9938-976efcabc968" containerName="switch-graph-3d0d2" Apr 16 20:12:54.003798 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.003736 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b94a4630-744f-491c-8350-af9ca0eb5a26" containerName="sequence-graph-99283" Apr 16 20:12:54.008144 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.008125 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" Apr 16 20:12:54.010772 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.010748 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:12:54.010892 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.010790 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-de58a-serving-cert\"" Apr 16 20:12:54.010892 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.010748 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-de58a-kube-rbac-proxy-sar-config\"" Apr 16 20:12:54.011861 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.011846 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\"" Apr 16 20:12:54.016800 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.016781 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"] Apr 16 20:12:54.138332 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.138302 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" Apr 16 20:12:54.138464 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.138340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" Apr 16 20:12:54.239514 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.239489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" Apr 16 20:12:54.239650 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.239528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" Apr 16 20:12:54.239719 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:12:54.239645 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-de58a-serving-cert: secret "ensemble-graph-de58a-serving-cert" not found Apr 16 20:12:54.239777 ip-10-0-140-191 kubenswrapper[2568]: E0416 
20:12:54.239733 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls podName:f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:54.739710004 +0000 UTC m=+1134.415712072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls") pod "ensemble-graph-de58a-7d7f775b78-25zfk" (UID: "f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf") : secret "ensemble-graph-de58a-serving-cert" not found
Apr 16 20:12:54.240103 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.240085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:12:54.743628 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.743600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:12:54.745967 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.745944 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") pod \"ensemble-graph-de58a-7d7f775b78-25zfk\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") " pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:12:54.919230 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:54.919204 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:12:55.036983 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.036948 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"]
Apr 16 20:12:55.040254 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:12:55.040227 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46f208b_213f_4c52_b9d4_b1c0bc3ae4cf.slice/crio-698389061a1c0639c4a14f9bc8899bb37c8a7ac5dcc7f5526ddb59edd410f4c4 WatchSource:0}: Error finding container 698389061a1c0639c4a14f9bc8899bb37c8a7ac5dcc7f5526ddb59edd410f4c4: Status 404 returned error can't find the container with id 698389061a1c0639c4a14f9bc8899bb37c8a7ac5dcc7f5526ddb59edd410f4c4
Apr 16 20:12:55.042451 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.042429 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:12:55.463984 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.463890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" event={"ID":"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf","Type":"ContainerStarted","Data":"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"}
Apr 16 20:12:55.463984 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.463930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" event={"ID":"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf","Type":"ContainerStarted","Data":"698389061a1c0639c4a14f9bc8899bb37c8a7ac5dcc7f5526ddb59edd410f4c4"}
Apr 16 20:12:55.463984 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.463966 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:12:55.481145 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:12:55.481101 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podStartSLOduration=2.481087537 podStartE2EDuration="2.481087537s" podCreationTimestamp="2026-04-16 20:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:55.480499755 +0000 UTC m=+1135.156501832" watchObservedRunningTime="2026-04-16 20:12:55.481087537 +0000 UTC m=+1135.157089664"
Apr 16 20:13:01.473788 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:01.473759 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:13:04.091758 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:04.091729 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"]
Apr 16 20:13:04.092338 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:04.091935 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" containerID="cri-o://71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7" gracePeriod=30
Apr 16 20:13:06.471983 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:06.471944 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:11.471806 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:11.471759 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:16.471775 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:16.471684 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:16.472236 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:16.471780 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:13:21.471661 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:21.471618 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:26.471102 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:26.471059 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:29.615979 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.615946 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:13:29.619274 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.619257 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:29.621995 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.621965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6b9e1-serving-cert\""
Apr 16 20:13:29.622120 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.621966 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6b9e1-kube-rbac-proxy-sar-config\""
Apr 16 20:13:29.629543 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.629514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:13:29.722361 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.722329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:29.722495 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.722374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:29.823494 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.823454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:29.823494 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.823505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:29.823670 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:13:29.823593 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-6b9e1-serving-cert: secret "sequence-graph-6b9e1-serving-cert" not found
Apr 16 20:13:29.823670 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:13:29.823662 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls podName:8152a0ff-4f25-419f-95a8-b3c3967fdfcf nodeName:}" failed. No retries permitted until 2026-04-16 20:13:30.323646655 +0000 UTC m=+1169.999648714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls") pod "sequence-graph-6b9e1-58b69f49c8-6r5th" (UID: "8152a0ff-4f25-419f-95a8-b3c3967fdfcf") : secret "sequence-graph-6b9e1-serving-cert" not found
Apr 16 20:13:29.824126 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:29.824109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:30.327648 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:30.327604 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:30.329890 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:30.329862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") pod \"sequence-graph-6b9e1-58b69f49c8-6r5th\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") " pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:30.529625 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:30.529593 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:30.648018 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:30.647993 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:13:30.649883 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:13:30.649856 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8152a0ff_4f25_419f_95a8_b3c3967fdfcf.slice/crio-8c5d7f0a71c8153a956df41c19ee9f00f96559a7df77cc3b214c180f0ae6022e WatchSource:0}: Error finding container 8c5d7f0a71c8153a956df41c19ee9f00f96559a7df77cc3b214c180f0ae6022e: Status 404 returned error can't find the container with id 8c5d7f0a71c8153a956df41c19ee9f00f96559a7df77cc3b214c180f0ae6022e
Apr 16 20:13:31.471301 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:31.471264 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:31.576186 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:31.576134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" event={"ID":"8152a0ff-4f25-419f-95a8-b3c3967fdfcf","Type":"ContainerStarted","Data":"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"}
Apr 16 20:13:31.576392 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:31.576192 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" event={"ID":"8152a0ff-4f25-419f-95a8-b3c3967fdfcf","Type":"ContainerStarted","Data":"8c5d7f0a71c8153a956df41c19ee9f00f96559a7df77cc3b214c180f0ae6022e"}
Apr 16 20:13:31.576392 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:31.576215 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:31.594931 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:31.594889 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podStartSLOduration=2.5948761879999998 podStartE2EDuration="2.594876188s" podCreationTimestamp="2026-04-16 20:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:31.592854153 +0000 UTC m=+1171.268856230" watchObservedRunningTime="2026-04-16 20:13:31.594876188 +0000 UTC m=+1171.270878264"
Apr 16 20:13:34.236095 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.236068 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:13:34.263321 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.263295 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") pod \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") "
Apr 16 20:13:34.263445 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.263372 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle\") pod \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\" (UID: \"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf\") "
Apr 16 20:13:34.263704 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.263680 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" (UID: "f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:13:34.265342 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.265315 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" (UID: "f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:13:34.364847 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.364781 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:13:34.364847 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.364805 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:13:34.585756 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.585722 2568 generic.go:358] "Generic (PLEG): container finished" podID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerID="71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7" exitCode=0
Apr 16 20:13:34.585906 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.585785 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"
Apr 16 20:13:34.585906 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.585807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" event={"ID":"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf","Type":"ContainerDied","Data":"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"}
Apr 16 20:13:34.585906 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.585848 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk" event={"ID":"f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf","Type":"ContainerDied","Data":"698389061a1c0639c4a14f9bc8899bb37c8a7ac5dcc7f5526ddb59edd410f4c4"}
Apr 16 20:13:34.585906 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.585871 2568 scope.go:117] "RemoveContainer" containerID="71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"
Apr 16 20:13:34.594127 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.594107 2568 scope.go:117] "RemoveContainer" containerID="71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"
Apr 16 20:13:34.594409 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:13:34.594392 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7\": container with ID starting with 71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7 not found: ID does not exist" containerID="71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"
Apr 16 20:13:34.594472 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.594417 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7"} err="failed to get container status \"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7\": rpc error: code = NotFound desc = could not find container \"71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7\": container with ID starting with 71b4a74a456e6c084ce9ac6611b77387d29206b14749e4a85aa371d68fce7fa7 not found: ID does not exist"
Apr 16 20:13:34.607242 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.607221 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"]
Apr 16 20:13:34.612236 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.612217 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-de58a-7d7f775b78-25zfk"]
Apr 16 20:13:34.897393 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:34.897367 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" path="/var/lib/kubelet/pods/f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf/volumes"
Apr 16 20:13:37.584913 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:37.584881 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:39.687466 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:39.687425 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:13:39.687859 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:39.687668 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" containerID="cri-o://2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a" gracePeriod=30
Apr 16 20:13:42.583521 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:42.583485 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:47.583157 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:47.583120 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:52.583323 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:52.583285 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:13:52.583771 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:52.583388 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:13:57.583151 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:13:57.583108 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:14:02.582695 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:02.582652 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:14:07.584040 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:07.583997 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:14:09.715764 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:14:09.715732 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8152a0ff_4f25_419f_95a8_b3c3967fdfcf.slice/crio-2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8152a0ff_4f25_419f_95a8_b3c3967fdfcf.slice/crio-conmon-2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 20:14:09.716080 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:14:09.715766 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8152a0ff_4f25_419f_95a8_b3c3967fdfcf.slice/crio-2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 20:14:10.329700 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.329677 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:14:10.434329 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.434302 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") pod \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") "
Apr 16 20:14:10.434454 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.434383 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle\") pod \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\" (UID: \"8152a0ff-4f25-419f-95a8-b3c3967fdfcf\") "
Apr 16 20:14:10.434748 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.434722 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8152a0ff-4f25-419f-95a8-b3c3967fdfcf" (UID: "8152a0ff-4f25-419f-95a8-b3c3967fdfcf"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:14:10.436182 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.436148 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8152a0ff-4f25-419f-95a8-b3c3967fdfcf" (UID: "8152a0ff-4f25-419f-95a8-b3c3967fdfcf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:10.535416 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.535394 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:14:10.535416 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.535415 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8152a0ff-4f25-419f-95a8-b3c3967fdfcf-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:14:10.693459 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.693425 2568 generic.go:358] "Generic (PLEG): container finished" podID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerID="2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a" exitCode=0
Apr 16 20:14:10.693578 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.693464 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" event={"ID":"8152a0ff-4f25-419f-95a8-b3c3967fdfcf","Type":"ContainerDied","Data":"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"}
Apr 16 20:14:10.693578 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.693485 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th" event={"ID":"8152a0ff-4f25-419f-95a8-b3c3967fdfcf","Type":"ContainerDied","Data":"8c5d7f0a71c8153a956df41c19ee9f00f96559a7df77cc3b214c180f0ae6022e"}
Apr 16 20:14:10.693578 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.693503 2568 scope.go:117] "RemoveContainer" containerID="2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"
Apr 16 20:14:10.693578 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.693502 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"
Apr 16 20:14:10.703667 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.703649 2568 scope.go:117] "RemoveContainer" containerID="2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"
Apr 16 20:14:10.703918 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:14:10.703897 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a\": container with ID starting with 2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a not found: ID does not exist" containerID="2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"
Apr 16 20:14:10.703967 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.703926 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a"} err="failed to get container status \"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a\": rpc error: code = NotFound desc = could not find container \"2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a\": container with ID starting with 2c31afa305b437c97f776e50bca33580a7983d26400dfb13db7c69c54176227a not found: ID does not exist"
Apr 16 20:14:10.716423 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.716400 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:14:10.720527 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.720508 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6b9e1-58b69f49c8-6r5th"]
Apr 16 20:14:10.897058 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:10.896993 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" path="/var/lib/kubelet/pods/8152a0ff-4f25-419f-95a8-b3c3967fdfcf/volumes"
Apr 16 20:14:14.308076 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308043 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"]
Apr 16 20:14:14.308471 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308389 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a"
Apr 16 20:14:14.308471 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308401 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a"
Apr 16 20:14:14.308471 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308414 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1"
Apr 16 20:14:14.308471 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308419 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1"
Apr 16 20:14:14.308605 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308481 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8152a0ff-4f25-419f-95a8-b3c3967fdfcf" containerName="sequence-graph-6b9e1"
Apr 16 20:14:14.308605 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.308490 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f46f208b-213f-4c52-b9d4-b1c0bc3ae4cf" containerName="ensemble-graph-de58a"
Apr 16 20:14:14.312625 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.312608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.316099 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.316075 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-12b34-kube-rbac-proxy-sar-config\""
Apr 16 20:14:14.316234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.316081 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-12b34-serving-cert\""
Apr 16 20:14:14.316234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.316086 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 20:14:14.316234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.316206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\""
Apr 16 20:14:14.319783 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.319652 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"]
Apr 16 20:14:14.466571 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.466543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.466700 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.466609 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.567345 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.567277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.567345 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.567329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.567892 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.567873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.569522 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.569502 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls\") pod \"ensemble-graph-12b34-9c57c45ff-5kq92\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.624576 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.624552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:14.738448 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:14.738421 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"]
Apr 16 20:14:14.740583 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:14:14.740552 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ee9fe4_47e4_40a4_a49e_6a2a6d949994.slice/crio-33ad8e80d2c882eda5aa52e460cf137d1d61b7ac6b34cde552472c630e9b4207 WatchSource:0}: Error finding container 33ad8e80d2c882eda5aa52e460cf137d1d61b7ac6b34cde552472c630e9b4207: Status 404 returned error can't find the container with id 33ad8e80d2c882eda5aa52e460cf137d1d61b7ac6b34cde552472c630e9b4207
Apr 16 20:14:15.709552 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:15.709519 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" event={"ID":"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994","Type":"ContainerStarted","Data":"66ad5f8a6af3e31a088a8543ec9850a803f19ced22cd783214f79e358b5f71ef"}
Apr 16 20:14:15.709552 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:15.709554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" event={"ID":"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994","Type":"ContainerStarted","Data":"33ad8e80d2c882eda5aa52e460cf137d1d61b7ac6b34cde552472c630e9b4207"}
Apr 16 20:14:15.709964 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:15.709575 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:15.727561 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:15.727510 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podStartSLOduration=1.727495911 podStartE2EDuration="1.727495911s" podCreationTimestamp="2026-04-16 20:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:15.726525138 +0000 UTC m=+1215.402527215" watchObservedRunningTime="2026-04-16 20:14:15.727495911 +0000 UTC m=+1215.403497988"
Apr 16 20:14:21.718545 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:21.718518 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"
Apr 16 20:14:49.923160 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.923080 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"]
Apr 16 20:14:49.926274 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.926254 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"
Apr 16 20:14:49.928966 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.928944 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1f9ad-serving-cert\""
Apr 16 20:14:49.929067 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.928948 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1f9ad-kube-rbac-proxy-sar-config\""
Apr 16 20:14:49.931585 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.931502 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"
Apr 16 20:14:49.931790
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.931770 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:49.933871 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:49.933843 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"] Apr 16 20:14:50.033279 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.033244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.033459 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.033291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.033459 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:14:50.033400 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-1f9ad-serving-cert: secret "sequence-graph-1f9ad-serving-cert" not found Apr 16 20:14:50.033531 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:14:50.033461 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls podName:6ff19e48-ce4e-4903-9ab5-5a820885deaa 
nodeName:}" failed. No retries permitted until 2026-04-16 20:14:50.533444027 +0000 UTC m=+1250.209446080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls") pod "sequence-graph-1f9ad-5789fb8557-kcq6v" (UID: "6ff19e48-ce4e-4903-9ab5-5a820885deaa") : secret "sequence-graph-1f9ad-serving-cert" not found Apr 16 20:14:50.033868 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.033848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.537107 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.537075 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.539562 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.539539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") pod \"sequence-graph-1f9ad-5789fb8557-kcq6v\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.837297 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.837194 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:50.958821 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:50.958799 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"] Apr 16 20:14:50.960685 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:14:50.960654 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ff19e48_ce4e_4903_9ab5_5a820885deaa.slice/crio-f2825c23ac3695be21a507cb1fb5bd4af6cba98587c718032332a848b1ebfefd WatchSource:0}: Error finding container f2825c23ac3695be21a507cb1fb5bd4af6cba98587c718032332a848b1ebfefd: Status 404 returned error can't find the container with id f2825c23ac3695be21a507cb1fb5bd4af6cba98587c718032332a848b1ebfefd Apr 16 20:14:51.815533 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:51.815493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" event={"ID":"6ff19e48-ce4e-4903-9ab5-5a820885deaa","Type":"ContainerStarted","Data":"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699"} Apr 16 20:14:51.815533 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:51.815537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" event={"ID":"6ff19e48-ce4e-4903-9ab5-5a820885deaa","Type":"ContainerStarted","Data":"f2825c23ac3695be21a507cb1fb5bd4af6cba98587c718032332a848b1ebfefd"} Apr 16 20:14:51.815757 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:51.815569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:14:51.832397 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:51.832351 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" 
podStartSLOduration=2.832336544 podStartE2EDuration="2.832336544s" podCreationTimestamp="2026-04-16 20:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:51.830874644 +0000 UTC m=+1251.506876719" watchObservedRunningTime="2026-04-16 20:14:51.832336544 +0000 UTC m=+1251.508338620" Apr 16 20:14:57.824652 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:14:57.824620 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:22:28.954235 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:28.954149 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"] Apr 16 20:22:28.956529 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:28.954404 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" containerID="cri-o://66ad5f8a6af3e31a088a8543ec9850a803f19ced22cd783214f79e358b5f71ef" gracePeriod=30 Apr 16 20:22:31.717165 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:31.717131 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:36.716516 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:36.716478 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:41.717162 ip-10-0-140-191 kubenswrapper[2568]: I0416 
20:22:41.717127 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:41.717566 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:41.717257 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" Apr 16 20:22:46.716795 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:46.716751 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:51.716973 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:51.716930 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:56.716564 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:56.716528 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:59.235461 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.235385 2568 generic.go:358] "Generic (PLEG): container finished" podID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerID="66ad5f8a6af3e31a088a8543ec9850a803f19ced22cd783214f79e358b5f71ef" exitCode=0 Apr 16 20:22:59.235787 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.235461 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" event={"ID":"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994","Type":"ContainerDied","Data":"66ad5f8a6af3e31a088a8543ec9850a803f19ced22cd783214f79e358b5f71ef"} Apr 16 20:22:59.584163 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.584139 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" Apr 16 20:22:59.701580 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.701547 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle\") pod \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " Apr 16 20:22:59.701772 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.701635 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls\") pod \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\" (UID: \"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994\") " Apr 16 20:22:59.701916 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.701890 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" (UID: "d7ee9fe4-47e4-40a4-a49e-6a2a6d949994"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:22:59.703661 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.703638 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" (UID: "d7ee9fe4-47e4-40a4-a49e-6a2a6d949994"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:22:59.802686 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.802613 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:22:59.802686 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:22:59.802638 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:23:00.239408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.239330 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" Apr 16 20:23:00.239408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.239340 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92" event={"ID":"d7ee9fe4-47e4-40a4-a49e-6a2a6d949994","Type":"ContainerDied","Data":"33ad8e80d2c882eda5aa52e460cf137d1d61b7ac6b34cde552472c630e9b4207"} Apr 16 20:23:00.239408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.239388 2568 scope.go:117] "RemoveContainer" containerID="66ad5f8a6af3e31a088a8543ec9850a803f19ced22cd783214f79e358b5f71ef" Apr 16 20:23:00.260605 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.260577 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"] Apr 16 20:23:00.263686 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.263666 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-12b34-9c57c45ff-5kq92"] Apr 16 20:23:00.896931 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:00.896893 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" path="/var/lib/kubelet/pods/d7ee9fe4-47e4-40a4-a49e-6a2a6d949994/volumes" Apr 16 20:23:04.625717 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:04.625682 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"] Apr 16 20:23:04.626105 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:04.625938 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" containerID="cri-o://f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699" gracePeriod=30 Apr 16 20:23:07.822591 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:07.822549 2568 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:12.823276 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:12.823236 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:17.823251 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:17.823203 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:17.823637 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:17.823336 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:23:22.823049 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:22.823012 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:27.823487 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:27.823448 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:32.822709 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:32.822670 2568 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:34.774089 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.774064 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:23:34.777884 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.777868 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle\") pod \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " Apr 16 20:23:34.778062 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.778049 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") pod \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\" (UID: \"6ff19e48-ce4e-4903-9ab5-5a820885deaa\") " Apr 16 20:23:34.778206 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.778164 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6ff19e48-ce4e-4903-9ab5-5a820885deaa" (UID: "6ff19e48-ce4e-4903-9ab5-5a820885deaa"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:23:34.778271 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.778254 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ff19e48-ce4e-4903-9ab5-5a820885deaa-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:23:34.779880 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.779857 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6ff19e48-ce4e-4903-9ab5-5a820885deaa" (UID: "6ff19e48-ce4e-4903-9ab5-5a820885deaa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:23:34.879226 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:34.879120 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ff19e48-ce4e-4903-9ab5-5a820885deaa-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:23:35.345564 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.345529 2568 generic.go:358] "Generic (PLEG): container finished" podID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerID="f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699" exitCode=0 Apr 16 20:23:35.345738 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.345587 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" Apr 16 20:23:35.345738 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.345588 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" event={"ID":"6ff19e48-ce4e-4903-9ab5-5a820885deaa","Type":"ContainerDied","Data":"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699"} Apr 16 20:23:35.345738 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.345628 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v" event={"ID":"6ff19e48-ce4e-4903-9ab5-5a820885deaa","Type":"ContainerDied","Data":"f2825c23ac3695be21a507cb1fb5bd4af6cba98587c718032332a848b1ebfefd"} Apr 16 20:23:35.345738 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.345644 2568 scope.go:117] "RemoveContainer" containerID="f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699" Apr 16 20:23:35.353753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.353736 2568 scope.go:117] "RemoveContainer" containerID="f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699" Apr 16 20:23:35.354001 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:23:35.353983 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699\": container with ID starting with f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699 not found: ID does not exist" containerID="f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699" Apr 16 20:23:35.354055 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.354011 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699"} err="failed to get container status 
\"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699\": rpc error: code = NotFound desc = could not find container \"f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699\": container with ID starting with f77241859bbe5d102b244c77d87911213ac25135dc352401d55ce3b242fdd699 not found: ID does not exist" Apr 16 20:23:35.361445 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.361424 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"] Apr 16 20:23:35.365480 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:35.365460 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1f9ad-5789fb8557-kcq6v"] Apr 16 20:23:36.897117 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:36.897085 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" path="/var/lib/kubelet/pods/6ff19e48-ce4e-4903-9ab5-5a820885deaa/volumes" Apr 16 20:23:39.207531 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207456 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207765 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207777 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207793 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207799 2568 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207841 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ff19e48-ce4e-4903-9ab5-5a820885deaa" containerName="sequence-graph-1f9ad" Apr 16 20:23:39.207887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.207849 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7ee9fe4-47e4-40a4-a49e-6a2a6d949994" containerName="ensemble-graph-12b34" Apr 16 20:23:39.211944 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.211927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.214549 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.214521 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:23:39.214677 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.214523 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e8b6e-serving-cert\"" Apr 16 20:23:39.214677 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.214625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e8b6e-kube-rbac-proxy-sar-config\"" Apr 16 20:23:39.214853 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.214835 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-jlhx9\"" Apr 16 20:23:39.218427 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.218406 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:23:39.310549 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.310519 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.310713 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.310568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.411107 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.411079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.411260 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.411129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.411260 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:23:39.411237 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-e8b6e-serving-cert: secret "splitter-graph-e8b6e-serving-cert" not found Apr 16 20:23:39.411343 ip-10-0-140-191 
kubenswrapper[2568]: E0416 20:23:39.411303 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls podName:cfcdf39b-e212-4898-85fa-06d2ecb7df5a nodeName:}" failed. No retries permitted until 2026-04-16 20:23:39.911286394 +0000 UTC m=+1779.587288447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls") pod "splitter-graph-e8b6e-5f9c98cc85-w42z2" (UID: "cfcdf39b-e212-4898-85fa-06d2ecb7df5a") : secret "splitter-graph-e8b6e-serving-cert" not found Apr 16 20:23:39.411720 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.411703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.915346 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.915313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:39.917656 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:39.917629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") pod \"splitter-graph-e8b6e-5f9c98cc85-w42z2\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:40.123410 ip-10-0-140-191 kubenswrapper[2568]: I0416 
20:23:40.123381 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:40.239420 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.239395 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:23:40.241101 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:23:40.241068 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcdf39b_e212_4898_85fa_06d2ecb7df5a.slice/crio-531976d699440a1e7a8bdcbd11644d1018957e6a509d5291ca760058cfb107d5 WatchSource:0}: Error finding container 531976d699440a1e7a8bdcbd11644d1018957e6a509d5291ca760058cfb107d5: Status 404 returned error can't find the container with id 531976d699440a1e7a8bdcbd11644d1018957e6a509d5291ca760058cfb107d5 Apr 16 20:23:40.242821 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.242805 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:23:40.363838 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.363800 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" event={"ID":"cfcdf39b-e212-4898-85fa-06d2ecb7df5a","Type":"ContainerStarted","Data":"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12"} Apr 16 20:23:40.363838 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.363840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" event={"ID":"cfcdf39b-e212-4898-85fa-06d2ecb7df5a","Type":"ContainerStarted","Data":"531976d699440a1e7a8bdcbd11644d1018957e6a509d5291ca760058cfb107d5"} Apr 16 20:23:40.364030 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.363915 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:40.380709 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:40.380656 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podStartSLOduration=1.380638175 podStartE2EDuration="1.380638175s" podCreationTimestamp="2026-04-16 20:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:23:40.378689253 +0000 UTC m=+1780.054691342" watchObservedRunningTime="2026-04-16 20:23:40.380638175 +0000 UTC m=+1780.056640252" Apr 16 20:23:46.372550 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:46.372480 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:23:49.284293 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:49.284257 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:23:49.284663 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:49.284468 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" containerID="cri-o://b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12" gracePeriod=30 Apr 16 20:23:51.371346 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:51.371309 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:56.370758 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:23:56.370725 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:01.370587 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:01.370549 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:01.370953 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:01.370655 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:24:06.370733 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:06.370691 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:11.370529 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:11.370487 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:14.839542 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.839503 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"] Apr 16 20:24:14.843898 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.843876 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:14.846408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.846379 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-653df-serving-cert\"" Apr 16 20:24:14.846518 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.846472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-653df-kube-rbac-proxy-sar-config\"" Apr 16 20:24:14.849907 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.849885 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"] Apr 16 20:24:14.975852 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.975821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:14.975990 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:14.975963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.076774 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.076747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: 
\"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.076928 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.076799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.076928 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:15.076893 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-653df-serving-cert: secret "switch-graph-653df-serving-cert" not found Apr 16 20:24:15.077016 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:15.076965 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls podName:53913be0-8d2a-4426-87ed-b208eb235786 nodeName:}" failed. No retries permitted until 2026-04-16 20:24:15.576949538 +0000 UTC m=+1815.252951592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls") pod "switch-graph-653df-fd48d9cd8-p5dbm" (UID: "53913be0-8d2a-4426-87ed-b208eb235786") : secret "switch-graph-653df-serving-cert" not found Apr 16 20:24:15.077364 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.077347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.581483 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.581451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.583782 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.583753 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") pod \"switch-graph-653df-fd48d9cd8-p5dbm\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") " pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.754708 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.754666 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:15.871427 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:15.871346 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"] Apr 16 20:24:16.370645 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:16.370613 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:16.472911 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:16.472884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" event={"ID":"53913be0-8d2a-4426-87ed-b208eb235786","Type":"ContainerStarted","Data":"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"} Apr 16 20:24:16.472911 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:16.472916 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" event={"ID":"53913be0-8d2a-4426-87ed-b208eb235786","Type":"ContainerStarted","Data":"e99ecfd1a8083b94b7e00df0a8e61f8db21e00d1734c3de9859dffa93d655f10"} Apr 16 20:24:16.473100 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:16.473009 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:16.489122 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:16.489078 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podStartSLOduration=2.4890648349999998 podStartE2EDuration="2.489064835s" podCreationTimestamp="2026-04-16 20:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-16 20:24:16.487557407 +0000 UTC m=+1816.163559486" watchObservedRunningTime="2026-04-16 20:24:16.489064835 +0000 UTC m=+1816.165066911" Apr 16 20:24:19.314435 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:19.314408 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcdf39b_e212_4898_85fa_06d2ecb7df5a.slice/crio-b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12.scope\": RecentStats: unable to find data in memory cache]" Apr 16 20:24:19.314818 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:19.314541 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcdf39b_e212_4898_85fa_06d2ecb7df5a.slice/crio-conmon-b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12.scope\": RecentStats: unable to find data in memory cache]" Apr 16 20:24:19.424707 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.424685 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:24:19.483268 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.483237 2568 generic.go:358] "Generic (PLEG): container finished" podID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerID="b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12" exitCode=0 Apr 16 20:24:19.483383 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.483277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" event={"ID":"cfcdf39b-e212-4898-85fa-06d2ecb7df5a","Type":"ContainerDied","Data":"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12"} Apr 16 20:24:19.483383 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.483292 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" Apr 16 20:24:19.483383 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.483307 2568 scope.go:117] "RemoveContainer" containerID="b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12" Apr 16 20:24:19.483383 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.483298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2" event={"ID":"cfcdf39b-e212-4898-85fa-06d2ecb7df5a","Type":"ContainerDied","Data":"531976d699440a1e7a8bdcbd11644d1018957e6a509d5291ca760058cfb107d5"} Apr 16 20:24:19.490545 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.490532 2568 scope.go:117] "RemoveContainer" containerID="b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12" Apr 16 20:24:19.490785 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:19.490766 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12\": container with ID starting with 
b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12 not found: ID does not exist" containerID="b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12" Apr 16 20:24:19.490840 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.490793 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12"} err="failed to get container status \"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12\": rpc error: code = NotFound desc = could not find container \"b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12\": container with ID starting with b11bb9e92b58679a610f322a52a157cd2f58dbf608ec121081117a572f119b12 not found: ID does not exist" Apr 16 20:24:19.511471 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.511450 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") pod \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " Apr 16 20:24:19.511556 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.511542 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle\") pod \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\" (UID: \"cfcdf39b-e212-4898-85fa-06d2ecb7df5a\") " Apr 16 20:24:19.511847 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.511828 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cfcdf39b-e212-4898-85fa-06d2ecb7df5a" (UID: "cfcdf39b-e212-4898-85fa-06d2ecb7df5a"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:24:19.513375 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.513358 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cfcdf39b-e212-4898-85fa-06d2ecb7df5a" (UID: "cfcdf39b-e212-4898-85fa-06d2ecb7df5a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:19.612992 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.612929 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:24:19.612992 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.612961 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfcdf39b-e212-4898-85fa-06d2ecb7df5a-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\"" Apr 16 20:24:19.803057 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.803027 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:24:19.806461 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:19.806439 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8b6e-5f9c98cc85-w42z2"] Apr 16 20:24:20.896678 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:20.896646 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" path="/var/lib/kubelet/pods/cfcdf39b-e212-4898-85fa-06d2ecb7df5a/volumes" Apr 16 20:24:22.489242 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:22.489209 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" Apr 16 20:24:59.507136 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.507098 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"] Apr 16 20:24:59.507572 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.507475 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" Apr 16 20:24:59.507572 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.507488 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" Apr 16 20:24:59.507649 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.507572 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfcdf39b-e212-4898-85fa-06d2ecb7df5a" containerName="splitter-graph-e8b6e" Apr 16 20:24:59.510595 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.510576 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:24:59.512894 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.512874 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9a649-kube-rbac-proxy-sar-config\"" Apr 16 20:24:59.513001 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.512930 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9a649-serving-cert\"" Apr 16 20:24:59.516294 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.516272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:24:59.516385 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.516351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:24:59.520506 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.520484 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"] Apr 16 20:24:59.617412 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.617378 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: 
\"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:24:59.617563 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.617450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:24:59.617563 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:59.617547 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-9a649-serving-cert: secret "splitter-graph-9a649-serving-cert" not found Apr 16 20:24:59.617637 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:24:59.617608 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls podName:865d281a-1fd9-46f7-82fa-d4451e002661 nodeName:}" failed. No retries permitted until 2026-04-16 20:25:00.117592193 +0000 UTC m=+1859.793594252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls") pod "splitter-graph-9a649-6c6f5d7c86-mqbhj" (UID: "865d281a-1fd9-46f7-82fa-d4451e002661") : secret "splitter-graph-9a649-serving-cert" not found Apr 16 20:24:59.618064 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:24:59.618038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:25:00.120572 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.120538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:25:00.122969 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.122946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") pod \"splitter-graph-9a649-6c6f5d7c86-mqbhj\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") " pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:25:00.421156 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.421082 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:25:00.536370 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.536340 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"] Apr 16 20:25:00.538752 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:25:00.538722 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865d281a_1fd9_46f7_82fa_d4451e002661.slice/crio-04ac60e823047647c753f5b8179420fbfb6809b0b959e3cdf71554d661f57f99 WatchSource:0}: Error finding container 04ac60e823047647c753f5b8179420fbfb6809b0b959e3cdf71554d661f57f99: Status 404 returned error can't find the container with id 04ac60e823047647c753f5b8179420fbfb6809b0b959e3cdf71554d661f57f99 Apr 16 20:25:00.606635 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.606606 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" event={"ID":"865d281a-1fd9-46f7-82fa-d4451e002661","Type":"ContainerStarted","Data":"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"} Apr 16 20:25:00.606756 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.606643 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" event={"ID":"865d281a-1fd9-46f7-82fa-d4451e002661","Type":"ContainerStarted","Data":"04ac60e823047647c753f5b8179420fbfb6809b0b959e3cdf71554d661f57f99"} Apr 16 20:25:00.606756 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.606707 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:25:00.635713 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:00.635674 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" 
podStartSLOduration=1.635660236 podStartE2EDuration="1.635660236s" podCreationTimestamp="2026-04-16 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:25:00.634352767 +0000 UTC m=+1860.310354843" watchObservedRunningTime="2026-04-16 20:25:00.635660236 +0000 UTC m=+1860.311662336" Apr 16 20:25:06.616689 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:25:06.616661 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" Apr 16 20:33:14.228622 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:14.228592 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"] Apr 16 20:33:14.231099 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:14.228818 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" containerID="cri-o://aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd" gracePeriod=30 Apr 16 20:33:16.615188 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:16.615135 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:33:21.614830 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:21.614794 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:33:26.615204 ip-10-0-140-191 kubenswrapper[2568]: I0416 
20:33:26.615146 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:33:26.615631 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:26.615280 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"
Apr 16 20:33:31.614939 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:31.614902 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:33:36.614466 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:36.614430 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:33:41.615288 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:41.615256 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:33:44.403674 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.403650 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"
Apr 16 20:33:44.602138 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.602108 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") pod \"865d281a-1fd9-46f7-82fa-d4451e002661\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") "
Apr 16 20:33:44.602337 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.602202 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle\") pod \"865d281a-1fd9-46f7-82fa-d4451e002661\" (UID: \"865d281a-1fd9-46f7-82fa-d4451e002661\") "
Apr 16 20:33:44.602572 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.602547 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "865d281a-1fd9-46f7-82fa-d4451e002661" (UID: "865d281a-1fd9-46f7-82fa-d4451e002661"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:33:44.604135 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.604111 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "865d281a-1fd9-46f7-82fa-d4451e002661" (UID: "865d281a-1fd9-46f7-82fa-d4451e002661"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:33:44.703222 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.703193 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865d281a-1fd9-46f7-82fa-d4451e002661-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:33:44.703222 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:44.703218 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/865d281a-1fd9-46f7-82fa-d4451e002661-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:33:45.131853 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.131817 2568 generic.go:358] "Generic (PLEG): container finished" podID="865d281a-1fd9-46f7-82fa-d4451e002661" containerID="aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd" exitCode=0
Apr 16 20:33:45.132020 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.131888 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"
Apr 16 20:33:45.132020 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.131887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" event={"ID":"865d281a-1fd9-46f7-82fa-d4451e002661","Type":"ContainerDied","Data":"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"}
Apr 16 20:33:45.132020 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.131930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj" event={"ID":"865d281a-1fd9-46f7-82fa-d4451e002661","Type":"ContainerDied","Data":"04ac60e823047647c753f5b8179420fbfb6809b0b959e3cdf71554d661f57f99"}
Apr 16 20:33:45.132020 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.131951 2568 scope.go:117] "RemoveContainer" containerID="aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"
Apr 16 20:33:45.139732 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.139713 2568 scope.go:117] "RemoveContainer" containerID="aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"
Apr 16 20:33:45.139985 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:33:45.139968 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd\": container with ID starting with aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd not found: ID does not exist" containerID="aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"
Apr 16 20:33:45.140040 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.139995 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd"} err="failed to get container status \"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd\": rpc error: code = NotFound desc = could not find container \"aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd\": container with ID starting with aa03da70e073a8c267d7a40f5219b9af81f9da27dd2898e83c5ebb408b19debd not found: ID does not exist"
Apr 16 20:33:45.148210 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.148189 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"]
Apr 16 20:33:45.151929 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:45.151898 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9a649-6c6f5d7c86-mqbhj"]
Apr 16 20:33:46.898155 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:33:46.898123 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" path="/var/lib/kubelet/pods/865d281a-1fd9-46f7-82fa-d4451e002661/volumes"
Apr 16 20:40:34.260051 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:34.259966 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"]
Apr 16 20:40:34.262267 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:34.260287 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" containerID="cri-o://0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2" gracePeriod=30
Apr 16 20:40:35.177027 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.176994 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hdxwb/must-gather-l24fh"]
Apr 16 20:40:35.177344 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.177331 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865d281a-1fd9-46f7-82fa-d4451e002661"
containerName="splitter-graph-9a649"
Apr 16 20:40:35.177394 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.177345 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649"
Apr 16 20:40:35.177429 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.177412 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="865d281a-1fd9-46f7-82fa-d4451e002661" containerName="splitter-graph-9a649"
Apr 16 20:40:35.180197 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.180183 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.182758 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.182734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxwb\"/\"openshift-service-ca.crt\""
Apr 16 20:40:35.182758 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.182753 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hdxwb\"/\"kube-root-ca.crt\""
Apr 16 20:40:35.183931 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.183914 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-hdxwb\"/\"default-dockercfg-5vnrp\""
Apr 16 20:40:35.200462 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.200441 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxwb/must-gather-l24fh"]
Apr 16 20:40:35.312413 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.312386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5njc\" (UniqueName: \"kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.312724 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.312451 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.412892 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.412865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.412999 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.412903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5njc\" (UniqueName: \"kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.413164 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.413148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.420784 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.420763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5njc\" (UniqueName: \"kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc\") pod \"must-gather-l24fh\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") " pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.488867 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.488807 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:40:35.607710 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.607686 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hdxwb/must-gather-l24fh"]
Apr 16 20:40:35.609886 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:40:35.609850 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75a7507a_7607_4e91_b058_b66ec357eb76.slice/crio-ecdba0899897ef4155bbaff98ccc26c491e04971f3227c9bac3910ff64a9d579 WatchSource:0}: Error finding container ecdba0899897ef4155bbaff98ccc26c491e04971f3227c9bac3910ff64a9d579: Status 404 returned error can't find the container with id ecdba0899897ef4155bbaff98ccc26c491e04971f3227c9bac3910ff64a9d579
Apr 16 20:40:35.611653 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:35.611637 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:40:36.338532 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:36.338483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxwb/must-gather-l24fh" event={"ID":"75a7507a-7607-4e91-b058-b66ec357eb76","Type":"ContainerStarted","Data":"ecdba0899897ef4155bbaff98ccc26c491e04971f3227c9bac3910ff64a9d579"}
Apr 16 20:40:37.485066 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:37.485023 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:40:40.355234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:40.355192 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxwb/must-gather-l24fh" event={"ID":"75a7507a-7607-4e91-b058-b66ec357eb76","Type":"ContainerStarted","Data":"6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38"}
Apr 16 20:40:40.355234 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:40.355240 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxwb/must-gather-l24fh" event={"ID":"75a7507a-7607-4e91-b058-b66ec357eb76","Type":"ContainerStarted","Data":"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"}
Apr 16 20:40:40.371391 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:40.371343 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hdxwb/must-gather-l24fh" podStartSLOduration=1.324745664 podStartE2EDuration="5.37132766s" podCreationTimestamp="2026-04-16 20:40:35 +0000 UTC" firstStartedPulling="2026-04-16 20:40:35.611753996 +0000 UTC m=+2795.287756050" lastFinishedPulling="2026-04-16 20:40:39.658335979 +0000 UTC m=+2799.334338046" observedRunningTime="2026-04-16 20:40:40.369943454 +0000 UTC m=+2800.045945529" watchObservedRunningTime="2026-04-16 20:40:40.37132766 +0000 UTC m=+2800.047329735"
Apr 16 20:40:42.484427 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:42.484379 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:40:47.485679 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:47.485640 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:40:47.486156
ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:47.485765 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"
Apr 16 20:40:48.436217 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:48.436183 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:49.176861 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:49.176826 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:49.911717 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:49.911686 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:50.620806 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:50.620753 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:51.342333 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:51.342307 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:52.053212 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:52.053163 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:52.485262 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:52.485164 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:40:52.767410 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:52.767384 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:53.478033 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:53.478002 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:54.219723 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:54.219694 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:54.934233 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:54.934199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:55.694654 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:55.694621 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:56.438576 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:56.438541 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-653df-fd48d9cd8-p5dbm_53913be0-8d2a-4426-87ed-b208eb235786/switch-graph-653df/0.log"
Apr 16 20:40:57.484048 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:57.484007 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:40:58.415977 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:58.415896 2568 generic.go:358] "Generic (PLEG): container finished" podID="75a7507a-7607-4e91-b058-b66ec357eb76" containerID="a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22" exitCode=0
Apr 16 20:40:58.415977 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:58.415967 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hdxwb/must-gather-l24fh" event={"ID":"75a7507a-7607-4e91-b058-b66ec357eb76","Type":"ContainerDied","Data":"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"}
Apr 16 20:40:58.416306 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:58.416292 2568 scope.go:117] "RemoveContainer" containerID="a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"
Apr 16 20:40:59.044008 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:40:59.043977 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hdxwb_must-gather-l24fh_75a7507a-7607-4e91-b058-b66ec357eb76/gather/0.log"
Apr 16 20:41:02.382916 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:02.382886 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2dpz7_6d82b3f2-f3ee-4cdc-98fe-a6b57d616a0a/global-pull-secret-syncer/0.log"
Apr 16 20:41:02.484408 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:02.484368 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:41:02.573580 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:02.573550 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nnsm9_9fa5d1b5-3f07-41ed-81f1-cd7e2a96551a/konnectivity-agent/0.log"
Apr 16 20:41:02.674425 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:02.674330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-191.ec2.internal_ab1d9840308a9e93932284ca7f6a67ee/haproxy/0.log"
Apr 16 20:41:04.415402 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.415381 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"
Apr 16 20:41:04.434056 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.434023 2568 generic.go:358] "Generic (PLEG): container finished" podID="53913be0-8d2a-4426-87ed-b208eb235786" containerID="0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2" exitCode=0
Apr 16 20:41:04.434191 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.434092 2568 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"
Apr 16 20:41:04.434191 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.434116 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" event={"ID":"53913be0-8d2a-4426-87ed-b208eb235786","Type":"ContainerDied","Data":"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"}
Apr 16 20:41:04.434191 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.434157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm" event={"ID":"53913be0-8d2a-4426-87ed-b208eb235786","Type":"ContainerDied","Data":"e99ecfd1a8083b94b7e00df0a8e61f8db21e00d1734c3de9859dffa93d655f10"}
Apr 16 20:41:04.434311 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.434197 2568 scope.go:117] "RemoveContainer" containerID="0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"
Apr 16 20:41:04.442034 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.442013 2568 scope.go:117] "RemoveContainer" containerID="0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"
Apr 16 20:41:04.442336 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:41:04.442312 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2\": container with ID starting with 0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2 not found: ID does not exist" containerID="0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"
Apr 16 20:41:04.442423 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.442347 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2"} err="failed to get container status \"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2\": rpc error: code = NotFound desc = could not find container \"0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2\": container with ID starting with 0cdcd68b5492d1e52019ff5517212bd5fc3438bf83432ecc945ea0ffad190bc2 not found: ID does not exist"
Apr 16 20:41:04.450725 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.450705 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle\") pod \"53913be0-8d2a-4426-87ed-b208eb235786\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") "
Apr 16 20:41:04.450832 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.450738 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") pod \"53913be0-8d2a-4426-87ed-b208eb235786\" (UID: \"53913be0-8d2a-4426-87ed-b208eb235786\") "
Apr 16 20:41:04.451125 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.451098 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "53913be0-8d2a-4426-87ed-b208eb235786" (UID: "53913be0-8d2a-4426-87ed-b208eb235786"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:41:04.452775 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.452751 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "53913be0-8d2a-4426-87ed-b208eb235786" (UID: "53913be0-8d2a-4426-87ed-b208eb235786"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:41:04.499042 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.499014 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hdxwb/must-gather-l24fh"]
Apr 16 20:41:04.499267 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.499246 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hdxwb/must-gather-l24fh" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="copy" containerID="cri-o://6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38" gracePeriod=2
Apr 16 20:41:04.503565 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.503541 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hdxwb/must-gather-l24fh"]
Apr 16 20:41:04.552002 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.551932 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53913be0-8d2a-4426-87ed-b208eb235786-openshift-service-ca-bundle\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:41:04.552002 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.551968 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53913be0-8d2a-4426-87ed-b208eb235786-proxy-tls\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:41:04.714437 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.714415 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hdxwb_must-gather-l24fh_75a7507a-7607-4e91-b058-b66ec357eb76/copy/0.log"
Apr 16 20:41:04.714794 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.714779 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:41:04.716813 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.716790 2568 status_manager.go:895] "Failed to get status for pod" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" pod="openshift-must-gather-hdxwb/must-gather-l24fh" err="pods \"must-gather-l24fh\" is forbidden: User \"system:node:ip-10-0-140-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hdxwb\": no relationship found between node 'ip-10-0-140-191.ec2.internal' and this object"
Apr 16 20:41:04.744276 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.744249 2568 status_manager.go:895] "Failed to get status for pod" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" pod="openshift-must-gather-hdxwb/must-gather-l24fh" err="pods \"must-gather-l24fh\" is forbidden: User \"system:node:ip-10-0-140-191.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hdxwb\": no relationship found between node 'ip-10-0-140-191.ec2.internal' and this object"
Apr 16 20:41:04.753244 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.753229 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output\") pod \"75a7507a-7607-4e91-b058-b66ec357eb76\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") "
Apr 16 20:41:04.753326 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.753276 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5njc\" (UniqueName: \"kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc\") pod \"75a7507a-7607-4e91-b058-b66ec357eb76\" (UID: \"75a7507a-7607-4e91-b058-b66ec357eb76\") "
Apr 16 20:41:04.754991 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.754968 2568 operation_generator.go:781]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "75a7507a-7607-4e91-b058-b66ec357eb76" (UID: "75a7507a-7607-4e91-b058-b66ec357eb76"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:41:04.755836 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.755818 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc" (OuterVolumeSpecName: "kube-api-access-s5njc") pod "75a7507a-7607-4e91-b058-b66ec357eb76" (UID: "75a7507a-7607-4e91-b058-b66ec357eb76"). InnerVolumeSpecName "kube-api-access-s5njc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:41:04.756491 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.756471 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"]
Apr 16 20:41:04.758587 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.758566 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-653df-fd48d9cd8-p5dbm"]
Apr 16 20:41:04.854521 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.854457 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5njc\" (UniqueName: \"kubernetes.io/projected/75a7507a-7607-4e91-b058-b66ec357eb76-kube-api-access-s5njc\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:41:04.854521 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.854479 2568 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75a7507a-7607-4e91-b058-b66ec357eb76-must-gather-output\") on node \"ip-10-0-140-191.ec2.internal\" DevicePath \"\""
Apr 16 20:41:04.896570 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.896543 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53913be0-8d2a-4426-87ed-b208eb235786" path="/var/lib/kubelet/pods/53913be0-8d2a-4426-87ed-b208eb235786/volumes"
Apr 16 20:41:04.896887 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:04.896875 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" path="/var/lib/kubelet/pods/75a7507a-7607-4e91-b058-b66ec357eb76/volumes"
Apr 16 20:41:05.438102 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.438068 2568 generic.go:358] "Generic (PLEG): container finished" podID="75a7507a-7607-4e91-b058-b66ec357eb76" containerID="6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38" exitCode=143
Apr 16 20:41:05.438542 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.438117 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hdxwb/must-gather-l24fh"
Apr 16 20:41:05.438542 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.438188 2568 scope.go:117] "RemoveContainer" containerID="6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38"
Apr 16 20:41:05.445048 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.445029 2568 scope.go:117] "RemoveContainer" containerID="a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"
Apr 16 20:41:05.456745 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.456719 2568 scope.go:117] "RemoveContainer" containerID="6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38"
Apr 16 20:41:05.456982 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:41:05.456963 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38\": container with ID starting with 6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38 not found: ID does not exist" containerID="6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38"
Apr 16 20:41:05.457029 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.456991 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38"} err="failed to get container status \"6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38\": rpc error: code = NotFound desc = could not find container \"6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38\": container with ID starting with 6bd9c19b758426a59eae40846c7b8bf9469a332dde2da346cdd7c5ce9c0a0c38 not found: ID does not exist"
Apr 16 20:41:05.457029 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.457017 2568 scope.go:117] "RemoveContainer" containerID="a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"
Apr 16 20:41:05.457252 ip-10-0-140-191 kubenswrapper[2568]: E0416 20:41:05.457234 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22\": container with ID starting with a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22 not found: ID does not exist" containerID="a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"
Apr 16 20:41:05.457309 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.457260 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22"} err="failed to get container status \"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22\": rpc error: code = NotFound desc = could not find container \"a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22\": container with ID starting with a34f401b00141b617d3c7db77dc20ccf3bff29320e56bb57035e0732658aca22 not found: ID does not exist"
Apr 16 20:41:05.785802 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.785773 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/alertmanager/0.log"
Apr 16 20:41:05.821814 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.821789 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/config-reloader/0.log"
Apr 16 20:41:05.847306 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.847283 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/kube-rbac-proxy-web/0.log"
Apr 16 20:41:05.874878 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.874850 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/kube-rbac-proxy/0.log"
Apr 16 20:41:05.901657 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.901633 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/kube-rbac-proxy-metric/0.log"
Apr 16 20:41:05.931038 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.931020 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/prom-label-proxy/0.log"
Apr 16 20:41:05.957590 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:05.957567 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0adcc642-de3f-4474-9dc5-282398ec9c9e/init-config-reloader/0.log"
Apr 16 20:41:06.019137 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.019111 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-z6zfs_943336c5-9e13-4677-a528-a07e32a448ef/kube-state-metrics/0.log"
Apr 16 20:41:06.044015 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.043949 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-z6zfs_943336c5-9e13-4677-a528-a07e32a448ef/kube-rbac-proxy-main/0.log"
Apr 16 20:41:06.065932 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.065911 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-z6zfs_943336c5-9e13-4677-a528-a07e32a448ef/kube-rbac-proxy-self/0.log"
Apr 16 20:41:06.100298 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.100272 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7878fffd95-cgn9r_6ebba194-30e5-4b1f-bdee-29507d5ed72d/metrics-server/0.log"
Apr 16 20:41:06.126804 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.126783 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kfvs5_0c00b183-44d6-4f42-be34-b7f63056fa91/monitoring-plugin/0.log"
Apr 16 20:41:06.157761 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.157726 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-97w44_e2549d95-7be7-41cf-859c-0af719d66591/node-exporter/0.log"
Apr 16 20:41:06.177213 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.177191 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-97w44_e2549d95-7be7-41cf-859c-0af719d66591/kube-rbac-proxy/0.log"
Apr 16 20:41:06.200940 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.200920 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-97w44_e2549d95-7be7-41cf-859c-0af719d66591/init-textfile/0.log"
Apr 16 20:41:06.639121 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.639092 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gqkzx_ccb4b340-2600-42d0-af5c-929cb99cf57c/prometheus-operator/0.log"
Apr 16 20:41:06.658147 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.658124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gqkzx_ccb4b340-2600-42d0-af5c-929cb99cf57c/kube-rbac-proxy/0.log" Apr 16 20:41:06.682472 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.682455 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-b9ndm_014c398b-8d9f-4b56-a942-ad2aadadf2f9/prometheus-operator-admission-webhook/0.log" Apr 16 20:41:06.709265 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.709245 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b7cd7d77f-2jlnv_824d29df-c9d9-42ca-b9b1-94f18a2e17ee/telemeter-client/0.log" Apr 16 20:41:06.732145 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.732112 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b7cd7d77f-2jlnv_824d29df-c9d9-42ca-b9b1-94f18a2e17ee/reload/0.log" Apr 16 20:41:06.753826 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:06.753801 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b7cd7d77f-2jlnv_824d29df-c9d9-42ca-b9b1-94f18a2e17ee/kube-rbac-proxy/0.log" Apr 16 20:41:08.924123 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:08.924041 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f97474bd-lch2q_00287a79-a46a-4dd3-bfcd-31bd74b8cf70/console/0.log" Apr 16 20:41:08.964521 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:08.964489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wf6ls_96e5dd67-29ce-447d-b662-38afb458d283/download-server/0.log" Apr 16 20:41:09.633152 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633122 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb"] Apr 16 20:41:09.633468 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633455 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" Apr 16 20:41:09.633523 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633471 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" Apr 16 20:41:09.633523 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633491 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="gather" Apr 16 20:41:09.633523 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633497 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="gather" Apr 16 20:41:09.633634 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633532 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="copy" Apr 16 20:41:09.633634 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633541 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="copy" Apr 16 20:41:09.633753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633660 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="gather" Apr 16 20:41:09.633753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633682 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="53913be0-8d2a-4426-87ed-b208eb235786" containerName="switch-graph-653df" Apr 16 20:41:09.633753 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.633693 2568 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="75a7507a-7607-4e91-b058-b66ec357eb76" containerName="copy" Apr 16 20:41:09.638913 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.638897 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.642651 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.642631 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"kube-root-ca.crt\"" Apr 16 20:41:09.642779 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.642697 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zn2n8\"/\"default-dockercfg-n9vjb\"" Apr 16 20:41:09.642779 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.642754 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"openshift-service-ca.crt\"" Apr 16 20:41:09.648866 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.648846 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb"] Apr 16 20:41:09.695815 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.695786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-lib-modules\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.695933 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.695825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6w7l\" (UniqueName: \"kubernetes.io/projected/329335e9-9451-4522-a668-aac087fcae2c-kube-api-access-s6w7l\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: 
\"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.695933 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.695853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-podres\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.695933 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.695892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-sys\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.695933 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.695921 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-proc\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797052 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-sys\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797058 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" 
(UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-proc\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-lib-modules\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6w7l\" (UniqueName: \"kubernetes.io/projected/329335e9-9451-4522-a668-aac087fcae2c-kube-api-access-s6w7l\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-proc\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-sys\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797216 ip-10-0-140-191 kubenswrapper[2568]: I0416 
20:41:09.797202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-podres\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797495 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-lib-modules\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.797495 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.797289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/329335e9-9451-4522-a668-aac087fcae2c-podres\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.805376 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.805359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6w7l\" (UniqueName: \"kubernetes.io/projected/329335e9-9451-4522-a668-aac087fcae2c-kube-api-access-s6w7l\") pod \"perf-node-gather-daemonset-tdsbb\" (UID: \"329335e9-9451-4522-a668-aac087fcae2c\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:09.948954 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:09.948877 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:10.039083 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.039057 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jhdrk_d94f7b13-7594-474a-a4d5-fc1f6d448d66/dns/0.log" Apr 16 20:41:10.060826 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.060798 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jhdrk_d94f7b13-7594-474a-a4d5-fc1f6d448d66/kube-rbac-proxy/0.log" Apr 16 20:41:10.066792 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.066771 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb"] Apr 16 20:41:10.070828 ip-10-0-140-191 kubenswrapper[2568]: W0416 20:41:10.070795 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod329335e9_9451_4522_a668_aac087fcae2c.slice/crio-479a90894eec91a64745ec1a100fc58e4462efbedd8c172ea45f095404a419f1 WatchSource:0}: Error finding container 479a90894eec91a64745ec1a100fc58e4462efbedd8c172ea45f095404a419f1: Status 404 returned error can't find the container with id 479a90894eec91a64745ec1a100fc58e4462efbedd8c172ea45f095404a419f1 Apr 16 20:41:10.187406 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.187383 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t5cbx_51444d65-22cd-418d-af7c-4510ee2ee6d2/dns-node-resolver/0.log" Apr 16 20:41:10.456446 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.456408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" event={"ID":"329335e9-9451-4522-a668-aac087fcae2c","Type":"ContainerStarted","Data":"f735b723b08b44fb97242428c39042ef988e7d09a4c9a6df2644a8ce1ac2d4d3"} Apr 16 20:41:10.456446 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.456445 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" event={"ID":"329335e9-9451-4522-a668-aac087fcae2c","Type":"ContainerStarted","Data":"479a90894eec91a64745ec1a100fc58e4462efbedd8c172ea45f095404a419f1"} Apr 16 20:41:10.456694 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.456491 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:10.472633 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.472589 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" podStartSLOduration=1.4725778250000001 podStartE2EDuration="1.472577825s" podCreationTimestamp="2026-04-16 20:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:10.471459786 +0000 UTC m=+2830.147461862" watchObservedRunningTime="2026-04-16 20:41:10.472577825 +0000 UTC m=+2830.148579900" Apr 16 20:41:10.612713 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:10.612683 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mcvlm_b352b6f3-ece0-4811-9bbf-e58c2cfe8081/node-ca/0.log" Apr 16 20:41:11.612071 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:11.612034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5xgdl_fded5762-e4ff-4f63-94bd-04c5209ebead/serve-healthcheck-canary/0.log" Apr 16 20:41:12.171406 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:12.171368 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrkv_1050de2f-179f-4060-953f-7c1c76584100/kube-rbac-proxy/0.log" Apr 16 20:41:12.200088 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:12.200057 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrkv_1050de2f-179f-4060-953f-7c1c76584100/exporter/0.log" Apr 16 20:41:12.223845 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:12.223823 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrkv_1050de2f-179f-4060-953f-7c1c76584100/extractor/0.log" Apr 16 20:41:14.261110 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:14.261085 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-d8j7f_e9c7327f-6025-406b-a2f1-d593ef74741e/manager/0.log" Apr 16 20:41:14.731133 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:14.731052 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-ttt7q_f090f732-3d0f-4172-812d-a4b0a0370733/manager/0.log" Apr 16 20:41:14.751505 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:14.751480 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-96bbb_9bf30547-62ef-4eea-b806-6a6eabe22ef3/s3-init/0.log" Apr 16 20:41:16.468962 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:16.468932 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-tdsbb" Apr 16 20:41:19.868599 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:19.868564 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zhw7_b7555a2d-0c33-4639-a546-dc00100629cf/kube-multus/0.log" Apr 16 20:41:20.223976 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.223909 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/kube-multus-additional-cni-plugins/0.log" Apr 16 20:41:20.249189 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.249150 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/egress-router-binary-copy/0.log" Apr 16 20:41:20.274442 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.274418 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/cni-plugins/0.log" Apr 16 20:41:20.296640 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.296618 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/bond-cni-plugin/0.log" Apr 16 20:41:20.316750 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.316727 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/routeoverride-cni/0.log" Apr 16 20:41:20.337240 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.337216 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/whereabouts-cni-bincopy/0.log" Apr 16 20:41:20.358574 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.358554 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p856m_bf55464c-e6ac-41d5-98de-59d9df6a82e0/whereabouts-cni/0.log" Apr 16 20:41:20.492017 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.491937 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v62bb_12ed67c2-088e-47ad-b2f4-d5da475ea9fc/network-metrics-daemon/0.log" Apr 16 20:41:20.516141 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:20.516110 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v62bb_12ed67c2-088e-47ad-b2f4-d5da475ea9fc/kube-rbac-proxy/0.log" Apr 16 20:41:21.269767 ip-10-0-140-191 
kubenswrapper[2568]: I0416 20:41:21.269714 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/ovn-controller/0.log" Apr 16 20:41:21.317468 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.317440 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/ovn-acl-logging/0.log" Apr 16 20:41:21.338284 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.338261 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/kube-rbac-proxy-node/0.log" Apr 16 20:41:21.358751 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.358728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:41:21.377047 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.377022 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/northd/0.log" Apr 16 20:41:21.396151 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.396126 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/nbdb/0.log" Apr 16 20:41:21.415925 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.415907 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/sbdb/0.log" Apr 16 20:41:21.601470 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:21.601394 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cspns_ede74e32-9e13-4250-9116-a7ce9f6af0a6/ovnkube-controller/0.log" Apr 16 20:41:23.068471 ip-10-0-140-191 
kubenswrapper[2568]: I0416 20:41:23.068442 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dpc5h_fe9d3b1f-3e6a-4f4c-86f4-1a3f245813b4/network-check-target-container/0.log" Apr 16 20:41:23.950826 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:23.950797 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4fs8r_cc946656-c72d-400e-a2cc-76aa86a4b014/iptables-alerter/0.log" Apr 16 20:41:24.586532 ip-10-0-140-191 kubenswrapper[2568]: I0416 20:41:24.586509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cqztz_5b914463-981e-407b-9d5c-37f855389e30/tuned/0.log"